
>extension to transformers that can focus the attention on just the relevant context.

That is what transformer attention already does in the first place, so you would effectively just be stacking two transformers.
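For reference, here is a minimal sketch of standard scaled dot-product attention (the names and shapes are illustrative, not anyone's specific implementation): the softmax over query-key similarities is precisely the mechanism that concentrates weight on the relevant context tokens for each query.

    # Minimal sketch of scaled dot-product attention (illustrative only).
    import numpy as np

    def scaled_dot_product_attention(Q, K, V):
        # Q: (n_queries, d), K: (n_keys, d), V: (n_keys, d_v)
        d = Q.shape[-1]
        # Similarity of each query to each context token
        scores = Q @ K.T / np.sqrt(d)
        # Softmax: most of the weight goes to the relevant keys
        weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
        weights /= weights.sum(axis=-1, keepdims=True)
        # Context is mixed in proportion to relevance
        return weights @ V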


