πŸ˜€ $\textbf{\textsf{Welcome to my Study Log!}}$

β€”β€”β€”β€”β€”β€”β€”β€”β€”β€”β€”β€”β€”β€”β€”β€”β€”

이 νŽ˜μ΄μ§€μ—μ„œλŠ” 더 λ‚˜μ€ μ—”μ§€λ‹ˆμ–΄λ§μ„ μœ„ν•œ λ‚˜λ¦„μ˜ 연ꡬ 기둝을 λͺ¨μ•„ κ³΅μœ ν•©λ‹ˆλ‹€. μ£Όμš” 논문리뷰와 기술적 μ‹€ν—˜μ΄ 주된 λ‚΄μš©μ΄λ©°, 각 μ±•ν„°μ—μ„œ 더 μžμ„Έν•œ λ‚΄μš©μ„ μ œκ³΅ν•©λ‹ˆλ‹€.

<aside> <img src="/icons/search_lightgray.svg" alt="/icons/search_lightgray.svg" width="40px" /> $\textbf{\textsf{More about me!}}$

πŸ“„ Resume

πŸ“š Portfolio

πŸ‘¨πŸ»β€πŸ’» Github

</aside>

πŸ“ $\textbf{\textsf{Paper \space Review}}$

β€”β€”β€”β€”β€”β€”β€”β€”β€”β€”

<aside> <img src="/icons/book_lightgray.svg" alt="/icons/book_lightgray.svg" width="40px" /> Key research papers organized by topic to track research trends

</aside>

Notable Models

- Transformer
- GPT-1
- BERT

Pre-trained Language Models

- XLNet
- BART
- T5

Efficient Models

- ALBERT
- DistilBERT
- BigBird

Training Strategies

- Sem-Ent
- LoRA
- FLAN