SeqToSeq

There has been huge advancement in the field of NLP over the last 10-15 years, and much of it has been driven by models such as sequence-to-sequence (seq2seq) models, which have major applications in language translation and paragraph summarization.
To generate their output, seq2seq models use search algorithms such as greedy search and beam search, sketched below.
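To make the two strategies concrete, here is a minimal sketch of greedy search and beam search. The names `step_fn`, `bos_id`, and `eos_id` are assumptions for illustration; in practice, `step_fn` would be a trained decoder that returns a probability distribution over the next token.

```python
import math

def greedy_search(step_fn, bos_id, eos_id, max_len=20):
    """Pick the single most probable token at every step."""
    tokens = [bos_id]
    for _ in range(max_len):
        probs = step_fn(tokens)            # dict: token_id -> probability
        next_tok = max(probs, key=probs.get)
        tokens.append(next_tok)
        if next_tok == eos_id:
            break
    return tokens

def beam_search(step_fn, bos_id, eos_id, beam_width=3, max_len=20):
    """Keep the `beam_width` highest-scoring partial sequences."""
    beams = [([bos_id], 0.0)]              # (tokens, log-probability)
    for _ in range(max_len):
        candidates = []
        for tokens, score in beams:
            if tokens[-1] == eos_id:       # finished beams carry over unchanged
                candidates.append((tokens, score))
                continue
            probs = step_fn(tokens)
            for tok, p in probs.items():
                candidates.append((tokens + [tok], score + math.log(p)))
        # Prune: keep only the best `beam_width` hypotheses.
        beams = sorted(candidates, key=lambda c: c[1], reverse=True)[:beam_width]
    return beams[0][0]                     # highest-scoring sequence
```

Greedy search commits to one token per step and can get stuck on a locally good but globally poor choice; beam search trades extra computation for exploring several hypotheses at once.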
Architecture of a seq2seq model:
The seq2seq model has two components: an encoder and a decoder.
How does it work?
The encoder compresses the input sequence into an intermediate representation (often called the context vector), and the decoder then converts that representation into the desired output sequence.
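Here is a minimal encoder-decoder sketch in PyTorch to illustrate the idea. The GRU layers, vocabulary sizes, and dimensions are illustrative assumptions, not details from any particular system.

```python
import torch
import torch.nn as nn

class Seq2Seq(nn.Module):
    def __init__(self, src_vocab, tgt_vocab, hidden=256):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, hidden)
        self.tgt_emb = nn.Embedding(tgt_vocab, hidden)
        self.encoder = nn.GRU(hidden, hidden, batch_first=True)
        self.decoder = nn.GRU(hidden, hidden, batch_first=True)
        self.out = nn.Linear(hidden, tgt_vocab)

    def forward(self, src, tgt):
        # Encoder: compress the source sequence into a context vector.
        _, context = self.encoder(self.src_emb(src))
        # Decoder: unroll from that context to produce the target sequence.
        dec_out, _ = self.decoder(self.tgt_emb(tgt), context)
        return self.out(dec_out)            # per-step vocabulary logits

model = Seq2Seq(src_vocab=1000, tgt_vocab=1000)
src = torch.randint(0, 1000, (1, 7))        # a toy source sentence
tgt = torch.randint(0, 1000, (1, 5))        # teacher-forced target prefix
logits = model(src, tgt)                    # shape: (1, 5, 1000)
```

At inference time, the decoder's logits feed a search procedure like the greedy or beam search shown earlier, producing one output token at a time.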
But where do seq2seq models sit within NLP? They fall under the field of neural machine translation.
Machine translation has two broad divisions of models:
  1. Neural machine translation
  2. Statistical machine translation
