This is both the big assignment for the 人神 course and a lab task. gjr, swx, zzd, and I formed a four-person team to tackle it, with seniors nyl and jfk carrying us.
Official site: https://stanfordnlp.github.io/coqa/
Related papers:
- CoQA [√, my notes]: CoQA: A Conversational Question Answering Challenge
- Self-attention mechanism [√, my notes]: A Structured Self-attentive Sentence Embedding (see the sketch after this list)
- BiDAF [√, my notes, gjr's notes]: Bidirectional Attention Flow for Machine Comprehension
- Dataset comparison [√, my notes]: A Qualitative Comparison of CoQA, SQuAD 2.0 and QuAC
- BiDAF++: QuAC: Question Answering in Context
- FlowQA [√, my notes, my slides, gjr's notes]: FlowQA: Grasping Flow in History for Conversational Machine Comprehension
- BERT: BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
- DrQA [gjr's notes]: Reading Wikipedia to Answer Open-Domain Questions
- FusionNet [zzd's notes]: FusionNet: Fusing via Fully-Aware Attention with Application to Machine Comprehension
- QANet [zzd's notes]: QANet: Combining Local Convolution with Global Self-Attention for Reading Comprehension
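
As a quick reference for the self-attention paper above, here is a minimal PyTorch sketch of the structured self-attention from Lin et al. (2017): A = softmax(W_s2 · tanh(W_s1 · H^T)) over BiLSTM states H, then M = A · H. The class name and default sizes are my own choices for illustration, not prescribed by the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class StructuredSelfAttention(nn.Module):
    """A = softmax(w_s2 . tanh(w_s1 . H^T)); M = A . H (Lin et al., 2017).
    Class name and default sizes are illustrative, not from the paper."""

    def __init__(self, d_hidden, d_att=350, n_heads=4):
        super().__init__()
        self.w_s1 = nn.Linear(d_hidden, d_att, bias=False)  # W_{s1}
        self.w_s2 = nn.Linear(d_att, n_heads, bias=False)   # W_{s2}

    def forward(self, h):
        # h: (batch, seq_len, d_hidden), e.g. BiLSTM hidden states
        # Softmax over the sequence dimension gives per-head attention weights.
        a = F.softmax(self.w_s2(torch.tanh(self.w_s1(h))), dim=1)  # (batch, seq_len, n_heads)
        m = torch.bmm(a.transpose(1, 2), h)  # (batch, n_heads, d_hidden): one embedding per head
        return m, a

# Toy usage: 2 sentences, 30 tokens, 600-dim BiLSTM outputs.
h = torch.randn(2, 30, 600)
m, a = StructuredSelfAttention(d_hidden=600)(h)
print(m.shape)  # torch.Size([2, 4, 600])
```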
Related links:
Related code:
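
As a starting point, a tiny sketch of walking a CoQA data file. The field names follow the JSON layout of the official release linked above; the local file name is an assumption.

```python
import json

# File name is an assumption; the JSON layout follows the official CoQA release
# (see https://stanfordnlp.github.io/coqa/).
with open("coqa-train-v1.0.json", encoding="utf-8") as f:
    data = json.load(f)["data"]

dialog = data[0]
print(dialog["story"][:100])  # the shared passage for this dialog
# Questions and answers are aligned turn by turn via turn_id.
for q, a in zip(dialog["questions"], dialog["answers"]):
    print(q["turn_id"], q["input_text"], "->", a["input_text"])
```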