LLMs: the giants behind AGI

Keywords: emergence, Transformers, "Attention Is All You Need", encoder, decoder, generative AI, classification task, parameters and tokens, prompt, large language models, generative pretrained transformers, BERT (Bidirectional Encoder Representations from Transformers).

AI, the rizz of the 21st century. Is it a big thing now? Is AI going to take over the world? Will AI replace us, and if so, how? Anyone who is tech-aware knows the fear and dystopian emotions that AI has instilled in us over the last four years. I would say AI might take over and we might end up like its slaves. Karma comes back, doesn't it...

Well, to understand how AI could take over humanity, we first need to understand how it works and how it replaces us. Getting into the details, we reach two major AI tools: ChatGPT and DALL-E.

One of them lies on the textual spectrum while the other lies on the visual spectrum. This article deals with the textual side of AI.



Let me be direct: the tech behind ChatGPT is the "Transformer", a neural network with two components, an encoder and a decoder. It was introduced by researchers at Google in a paper named "Attention Is All You Need".
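Both the encoder and the decoder are built from the same core operation that the paper's title refers to: scaled dot-product attention. Here is a minimal NumPy sketch of just that operation; the shapes and random inputs are purely illustrative, not taken from any real model:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # The core formula from "Attention Is All You Need":
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # how relevant each key is to each query
    # numerically stable softmax over each row
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights      # weighted mix of values, plus the weights

rng = np.random.default_rng(0)
Q = rng.normal(size=(4, 8))  # 4 tokens, 8-dimensional vectors (toy sizes)
K = rng.normal(size=(4, 8))
V = rng.normal(size=(4, 8))
out, w = scaled_dot_product_attention(Q, K, V)
```

Each output row is a blend of the value vectors, weighted by how strongly that token "attends" to every other token; real transformers run many of these attention heads in parallel, per layer.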

Let me define it in another layer of detail, "generative pretrained transformer": the decoder part does the generation task; the model is pretrained on a large amount of data and then fine-tuned for specific tasks; and the transformer itself stacks dozens of these layers (GPT models actually use only the decoder stack; GPT-3 has 96 decoder layers) holding billions of parameters, AKA weights, AKA the soul...
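The "generative" part works autoregressively: the decoder predicts one next token, appends it to the input, and repeats. A toy sketch of that loop, where a hypothetical bigram lookup table stands in for the billions of learned parameters:

```python
# Toy stand-in for a trained model: maps each token to its predicted next token.
# A real GPT computes this prediction with its decoder layers instead.
bigram_model = {
    "attention": "is", "is": "all", "all": "you", "you": "need",
}

def generate(prompt_token, max_new_tokens=4):
    tokens = [prompt_token]
    for _ in range(max_new_tokens):
        nxt = bigram_model.get(tokens[-1])
        if nxt is None:          # model has no prediction: stop generating
            break
        tokens.append(nxt)       # autoregressive: output feeds back as input
    return " ".join(tokens)

print(generate("attention"))     # prints "attention is all you need"
```

The prompt you type into ChatGPT plays the role of `prompt_token` here: it seeds the loop, and everything after it is generated one token at a time.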

To reach a deeper level of detail, I provide the following link:

Attention is all you need: https://arxiv.org/abs/1706.03762
