Holdings
LDR |   |   | 01931cam a2200000 a
001 |   |   | 0100544520▲
003 |   |   | OCoLC▲
005 |   |   | 20220103160341▲
007 |   |   | ta▲
008 |   |   | 210707s2021 enka o 001 0 eng d▲
020 |   |   | ▼a9781801077651 (pbk.)▲
020 |   |   | ▼a1801077657▲
020 |   |   | ▼z1801078890▲
020 |   |   | ▼z9781801078894▼q(electronic bk.)▲
082 | 0 | 4 | ▼a006.35▼223▲
090 |   |   | ▼a006.35▼bY51m▲
100 | 1 |   | ▼aYıldırım, Savaş.▲
245 | 1 | 0 | ▼aMastering transformers :▼bbuild SOTA models from scratch with advanced natural language processing techniques /▼cSavas Yildirim, Meysam Asgari-chenaghlu.▲
260 |   |   | ▼aBirmingham :▼bPackt Publishing,▼c2021.▲
300 |   |   | ▼axvi, 357 p. :▼bill. ;▼c24 cm▲
500 |   |   | ▼aIncludes index.▲
505 | 0 |   | ▼aFrom Bag-of-Words to the Transformers -- A Hands-On Introduction to the Subject -- Autoencoding Language Models -- Autoregressive and Other Language Models -- Fine-Tuning Language Models for Text Classification -- Fine-Tuning Language Models for Token Classification -- Text Representation -- Working with Efficient Transformers -- Cross-Lingual and Multilingual Language Modeling -- Serving Transformer Models -- Attention Visualization and Experiment Tracking▲
650 |   | 0 | ▼aNatural language processing (Computer science)▲
700 | 1 |   | ▼aAsgari-Chenaghlu, Meysam.▲

Mastering transformers : build SOTA models from scratch with advanced natural language processing techniques
Material Type
Foreign Monograph
Title / Statement of Responsibility
Mastering transformers : build SOTA models from scratch with advanced natural language processing techniques / Savas Yildirim, Meysam Asgari-chenaghlu.
Publication
Birmingham : Packt Publishing, 2021.
Physical Description
xvi, 357 p. : ill. ; 24 cm
General Note
Includes index.
Contents Note
From Bag-of-Words to the Transformers -- A Hands-On Introduction to the Subject -- Autoencoding Language Models -- Autoregressive and Other Language Models -- Fine-Tuning Language Models for Text Classification -- Fine-Tuning Language Models for Token Classification -- Text Representation -- Working with Efficient Transformers -- Cross-Lingual and Multilingual Language Modeling -- Serving Transformer Models -- Attention Visualization and Experiment Tracking
ISBN
9781801077651 (pbk.) 1801077657
Call Number
006.35 Y51m