Book Introduction
Published: January 30, 2026 | ISBN: 116073819X | 492 pages | Non-standard format
About the Authors
Author: 최성윤 (Lucius Choi)
School of Engineering, University of California, Irvine (UC Irvine), USA
M.Eng., Electrical and Computer Engineering
B.S., Computer Engineering
Reviewer: 이원찬 (Wonchan Lee)
Ph.D., Electronic Engineering, Korea University (image processing and artificial intelligence)
Professor, Graduate School of Artificial Intelligence, Seoul National University
President, Korea Artificial Intelligence Technology Industry Association
Reviewer: 조정훈 (Jeonghun Cho, Ph.D.)
Ph.D. in Engineering, Division of Electrical Engineering and Computer Science, KAIST
Currently: Professor, School of Electronics Engineering, Kyungpook National University
Table of Contents

Chapter 0
Before You Start
1. Linux
2. Python Version Management
3. Jupyter
4. CUDA/cuDNN
5. Windows Subsystem for Linux
Chapter 1
Statistics for Data
Analysis
1. Data Architecture
1.1. Introduction to AI world
1.2. RAM & VRAM
1.3. Data Containers
1.4. DataType
1.5. NaN
1.6. ndarray
1.7. Broadcasting
1.8. Vectorization
1.9. View & Copy
2. Data Types and Classification
2.1. Measurement Scales
2.2. Gray Areas and Pitfalls (Avoiding Common Misconceptions)
2.3. Decision Examples
2.4. Data Formats
2.5. Reading Data in Python
3. Inferential Statistics
3.1. Purpose and Terminology
3.2. Hypothesis Testing
3.3. Effect Size and Practical Significance
4. Correlation and Regression Analysis
4.1. Correlation Analysis
4.2. Regression
4.3. Diagnosis and Improvement
4.4. Regularized Regression
4.5. Goals of Regression: Prediction and Explanation
5. Question
Chapter 2
Big Data and Machine
Learning
0. Introduction to Machine Learning
1. Understanding Machine Learning Concepts and Processes
1.1. Problem Definition and Objective Functions
1.2. Data Lifecycle and Pipelines
1.3. Data Splitting Strategies
2. Understanding Evaluation Metrics and Tuning Methods by Learning Task
2.1. Classification
2.2. Classification Exercise: MNIST Handwritten Digit Recognition
2.3. Regression
2.4. Regression Exercise: UCI Bike Sharing
2.5. Anomaly Detection
2.6. Exercise: An Anomaly Detection Case
3. Using Supervised Classification/Regression Models
3.1. k-Nearest Neighbors Classifier
3.2. Linear Regression
3.3. Logistic Regression
3.4. Decision Tree
3.5. Random Forest
3.6. Support Vector Machine
3.7. Gradient Boosting
3.8. Naive Bayes
3.9. Questions
4. Understanding and Using Unsupervised Learning Models
4.1. K-means
4.2. Hierarchical Clustering
4.3. Density-based spatial clustering of applications with noise
4.4. Principal Component Analysis
4.5. t-distributed stochastic neighbor embedding
4.6. Isolation Forest
4.7. Local Outlier Factor
4.8. Questions
Chapter 3
Computer Vision
Programming
1. Computer Vision Concept & Image Architecture
1.1. Pixel and Video
1.2. Color and Brightness
1.3. Image File Format
1.4. Metadata
1.5. The Camera Image Pipeline
1.6. Sampling and Interpolation
1.7. Cameras and Geometry
1.8. Computational Thinking+
1.9. Frequency
2. Basic Image Data Processing
2.1. I/O and Visualization
2.2. Region of Interest & Masking
2.3. Gradients, Edges, and Corners
2.4. Brightness Histograms
2.5. Noise and Smoothing
2.6. Binarization
2.7. Morphology
2.8. Frequency Filtering
Appendix: OpenCV Function Reference
3. Questions
Chapter 4
Basic Deep Learning
0. Intro
0.1. Introduction to Deep Learning
0.2. Accelerator Hardware and Deep Learning+
1. Tensors and Automatic Differentiation
1.1. Shape
1.2. dtype
1.3. Broadcasting
1.4. View & Copy
1.5. Vectorization
1.6. Gradients
2. Loss Functions and Probabilistic Outputs
2.1. Logit
2.2. Sigmoid & Softmax
2.3. CrossEntropy
2.4. Threshold
3. Multilayer Perceptron (MLP)
3.1. Single Layer Perceptron
3.2. MLP
3.3. MLP on UCI Wine with Torch
3.4. MLP on UCI Wine with TensorFlow/Keras
3.5. Activation Function
3.6. Dropout
3.7. Backpropagation
4. Convolutional Neural Network
4.1. Convolution
4.2. Kernel
4.3. Stride
4.4. Padding
4.5. Pointwise Convolution
4.6. Normalization
4.7. Computational Cost
4.8. Pooling
4.9. Receptive Field
5. CNN Models
5.1. AlexNet(2012)
5.2. Inception(2014)
5.3. VGG-16(2014)
5.4. ResNet
5.5. MobileNet
5.6. UNet
6. Understanding and Using Transfer Learning
6.1. Transfer Learning
6.2. Training Stabilization
7. Questions
7.1. Part 1
7.2. Part 2
Chapter 5
Transfer Learning-based
Models
1. Object Detection
1.1. Problem Definition
1.2. Evaluation Metrics
1.3. Two-stage
1.4. One-Stage: You Only Look Once
1.5. Anchor vs Anchor-free
1.6. Training Components
1.7. Causes of Performance Degradation
1.8. Pretrained backbone
2. You Only Look Once
2.1. The Structure of YOLO
2.2. Input and Output Pipelines
2.3. An Intuition for YOLO Training
2.4. YOLO Inference and Deployment
2.5. Transfer-Learning a Pretrained YOLO on Your Own Data
3. Understanding and Using GAN Concepts and Architectures
3.1. Generative Adversarial Network
3.2. From Basic GANs to Variants
3.3. An Intuition for GAN Training
4. HuggingFace
4.1. Transfer Learning with HuggingFace
4.2. Commonly Used Models on HuggingFace
4.3. State of the Art
5. Questions
Chapter 6
NLP
1. Introduction to Natural Language Processing
1.1. Definition of NLP
1.2. Understanding, Generation, and Transformation
1.3. Units of Processing
1.4. Traditional Rule-Based NLP
1.5. Neural-Network-Based NLP
2. Data Preprocessing Steps and Methods
2.1. Text Normalization
2.2. English vs Korean
2.3. Tokenization
2.4. Cleaning
2.5. Label Quality Management
3. Understanding and Using RNN/LSTM Architectures
3.1. RNN Basics
3.2. Long Short-Term Memory
3.3. Gated Recurrent Unit
3.4. Seq2Seq
4. Transformer
4.1. Attention is All You Need
4.2. The Standard Transformer Architecture
4.3. Transformer Families and Training Objectives
4.4. Fine-Tuning
4.5. Optimization
4.6. Retrieval-Augmented Generation
5. Last Page