

    Before lookup

    1.1

    1. Jordan M I, Mitchell T M. Machine learning: Trends, perspectives, and prospects[J]. Science, 2015, 349(6245): 255-260.

    2. Burkart N, Huber M F. A survey on the explainability of supervised machine learning[J]. Journal of Artificial Intelligence Research, 2021, 70: 245-317.

    3. Tan C, Sun F, Kong T, et al. A survey on deep transfer learning[C]//Artificial Neural Networks and Machine Learning–ICANN 2018: 27th International Conference on Artificial Neural Networks, Rhodes, Greece, October 4-7, 2018, Proceedings, Part III 27. Springer International Publishing, 2018: 270-279.

    4. Incomplete multisource transfer learning

    5. Crammer K, Kearns M, Wortman J. Learning from Multiple Sources[J]. Journal of Machine Learning Research, 2008, 9: 1757-1774.

    6. Mansour Y, Mohri M, Rostamizadeh A. Domain adaptation with multiple sources[C]. Proceedings of the 21st International Conference on Neural Information Processing Systems. 2008: 1041-1048.

    7. 领域自适应研究综述

    8. Pan S J, Tsang I W, Kwok J T, et al. Domain Adaptation Via Transfer Component Analysis[J]. IEEE Transactions on Neural Networks, 2010, 22(2): 199-210.

    9. 融合知识的领域自适应方法综述

    10. Wilson G, Cook D J. A survey of unsupervised deep domain adaptation[J]. ACM Transactions on Intelligent Systems and Technology (TIST), 2020, 11(5): 1-46.

    11. 基于语言模型的预训练技术研究综述

      1.2

    12. 基于score样本选择的同构域适应迁移学习

    13. 基于模糊规则学习的无监督异构领域自适应

    14. Heterogeneous Domain Adaptation Through Progressive Alignment

    15. 基于半监督CRF的跨领域中文分词

    16. 基于迁移学习的无监督跨域人脸表情识别

    17. 基于类内最大均值差异的无监督领域自适应算法

    18. A Survey on Negative Transfer

    19. Adversarial Multiple Source Domain Adaptation

    20. Aligning Domain-Specific Distribution and Classifier for Cross-Domain Classification from Multiple Sources

    21. 基于实例迁移的跨项目软件缺陷预测

    22. 基于多源动态TrAdaBoost的实例迁移学习方法

    23. 基于特征迁移和实例迁移的跨项目缺陷预测方法

    24. Cross-language knowledge transfer using multilingual deep neural network with shared hidden layers

    25. Unsupervised domain adaptation with residual transfer networks. In: Advances in Neural Information Processing Systems

    26. Deep hashing network for efficient similarity retrieval

    27. Mapping and Revising Markov Logic Networks for Transfer Learning

    28. Transfer Learning by Mapping with Minimal Target Data

    29. 基于跨域结构保持投影的多源异构在线迁移学习研究

    30. Rethinking ImageNet Pre-training

    31. Do Better ImageNet Models Transfer Better?

    32. A kernel two-sample test

    33. Central moment discrepancy (CMD) for domain-invariant representation learning

    34. Correlation alignment for unsupervised domain adaptation

    35. Deep Domain Confusion: Maximizing for Domain Invariance

    36. Unsupervised Domain Adaptation by Backpropagation

    37. Domain Separation Networks

      2.1

    38. A Survey on Transfer Learning

    39. 迁移学习导论(王晋东)

      2.2

    40. Wang A, Singh A, Michael J, et al. GLUE: A multi-task benchmark and analysis platform for natural language understanding[J]. arXiv preprint arXiv:1804.07461, 2018.

    41. Devlin J, Chang M W, Lee K, et al. Bert: Pre-training of deep bidirectional transformers for language understanding[J]. arXiv preprint arXiv:1810.04805, 2018.

    42. Radford A, Narasimhan K, Salimans T, et al. Improving language understanding by generative pre-training[J]. 2018.

    43. Yang Z, Dai Z, Yang Y, et al. Xlnet: Generalized autoregressive pretraining for language understanding[J]. Advances in neural information processing systems, 2019, 32.

    44. Ding N, Qin Y, Yang G, et al. Parameter-efficient fine-tuning of large-scale pre-trained language models[J]. Nature Machine Intelligence, 2023, 5(3): 220-235.

    45. Parameter-Efficient Transfer Learning for NLP

    46. Compacter: Efficient Low-Rank Hypercomplex Adapter Layers

    47. Prefix-Tuning: Optimizing Continuous Prompts for Generation

    48. AdapterFusion: Non-destructive task composition for transfer learning

    49. Neural Machine Translation by Jointly Learning to Align and Translate

      3.1

    50. Biographies, Bollywood, Boom-boxes and Blenders: Domain Adaptation for Sentiment Classification

    51. Adversarial Multi-task Learning for Text Classification

    52. UDAPTER

    53. Learning word vectors for sentiment analysis.

    54. Exploiting class relationships for sentiment categorization with respect to rating scales

    55. A broad-coverage challenge corpus for sentence understanding through inference

      3.3

    56. Ganin Y, Ustinova E, Ajakan H, et al. Domain-adversarial training of neural networks[J]. Journal of machine learning research, 2016, 17(59): 1-35.

    5.2

    Ganin Y, Ustinova E, Ajakan H, et al. Domain-adversarial training of neural networks[J]. Journal of machine learning research, 2016, 17(59): 1-35.

    After lookup

    1. Jordan M I, Mitchell T M. Machine learning: Trends, perspectives, and prospects[J]. Science, 2015, 349(6245): 255-260.

    2. Burkart N, Huber M F. A survey on the explainability of supervised machine learning[J]. Journal of Artificial Intelligence Research, 2021, 70: 245-317.

    3. Tan C, Sun F, Kong T, et al. A survey on deep transfer learning[C]//Artificial Neural Networks and Machine Learning–ICANN 2018: 27th International Conference on Artificial Neural Networks, Rhodes, Greece, October 4-7, 2018, Proceedings, Part III 27. Springer International Publishing, 2018: 270-279.

    4. Ding Z, Shao M, Fu Y. Incomplete multisource transfer learning[J]. IEEE transactions on neural networks and learning systems, 2016, 29(2): 310-323.

    5. Crammer K, Kearns M, Wortman J. Learning from Multiple Sources[J]. Journal of Machine Learning Research, 2008, 9: 1757-1774.

    6. Mansour Y, Mohri M, Rostamizadeh A. Domain adaptation with multiple sources[C]. Proceedings of the 21st International Conference on Neural Information Processing Systems. 2008: 1041-1048.

    7. 李晶晶, 孟利超, 张可, 等. 领域自适应研究综述[J]. 计算机工程, 2021, 47(6): 1-13.

    8. Pan S J, Tsang I W, Kwok J T, et al. Domain adaptation via transfer component analysis[J]. IEEE transactions on neural networks, 2010, 22(2): 199-210.

    9. 崔福伟, 吴璇璇, 陈钰枫, 刘健, 徐金安.融合知识的领域自适应方法综述[J]. 计算机科学, 2023, 50(8): 142-149.

    10. Wilson G, Cook D J. A survey of unsupervised deep domain adaptation[J]. ACM Transactions on Intelligent Systems and Technology (TIST), 2020, 11(5): 1-46.

    11. 岳增营, 叶霞, 刘睿珩. 基于语言模型的预训练技术研究综述[J]. 中文信息学报, 2021, 35(9): 15-29.

    12. 董莹莹, 邓万宇, 刘光达. 基于 score 样本选择的同构域适应迁移学习[J]. 计算机与数字工程, 2019, 47(12): 2989-2992, 3153.

    13. 孙武, 邓赵红, 娄琼丹, 等. 基于模糊规则学习的无监督异构领域自适应[J]. 计算机科学与探索, 2022, 16(2): 403-412.

    14. Li J, Lu K, Huang Z, et al. Heterogeneous domain adaptation through progressive alignment[J]. IEEE transactions on neural networks and learning systems, 2018, 30(5): 1381-1391.

    15. 邓丽萍, 罗智勇. 基于半监督 CRF 的跨领域中文分词[J]. 中文信息学报, 2017, 31(4): 9-19.

    16. 莫宏伟, 傅智杰. 基于迁移学习的无监督跨域人脸表情识别[J]. 智能系统学报, 2021, 16(3): 397-406.

    17. 蔡瑞初, 李嘉豪, 郝志峰. 基于类内最大均值差异的无监督领域自适应算法[J]. 计算机应用研究, 2020, 37(8): 2371-2375.

    18. Zhang W, Deng L, Zhang L, et al. A survey on negative transfer[J]. IEEE/CAA Journal of Automatica Sinica, 2022, 10(2): 305-329.

    19. Zhao H, Zhang S, Wu G, et al. Adversarial multiple source domain adaptation[J]. Advances in neural information processing systems, 2018, 31.

    20. Zhu Y, Zhuang F, Wang D. Aligning domain-specific distribution and classifier for cross-domain classification from multiple sources[C]//Proceedings of the AAAI conference on artificial intelligence. 2019, 33(01): 5989-5996.

    21. 毛发贵, 李碧雯, 沈备军. 基于实例迁移的跨项目软件缺陷预测[J]. 计算机科学与探索, 2016, 10(1): 43-55.

    22. 张倩, 李海港, 李明, 等. 基于多源动态 TrAdaBoost 的实例迁移学习方法[J]. 中国矿业大学学报, 2014, 43(4): 713-720.

    23. 倪超, 陈翔, 刘望舒, 等. 基于特征迁移和实例迁移的跨项目缺陷预测方法[J]. 软件学报, 2019, 30(5): 1308-1329.

    24. Huang J T, Li J, Yu D, et al. Cross-language knowledge transfer using multilingual deep neural network with shared hidden layers[C]//2013 IEEE international conference on acoustics, speech and signal processing. IEEE, 2013: 7304-7308.

    25. Long M, Zhu H, Wang J, et al. Unsupervised domain adaptation with residual transfer networks[J]. Advances in neural information processing systems, 2016, 29.

    26. Zhu H, Long M, Wang J, et al. Deep Hashing Network for efficient similarity retrieval[C]//Proceedings of the Thirtieth AAAI Conference on Artificial Intelligence. 2016: 2415-2421.

    27. Mihalkova L, Huynh T, Mooney R J. Mapping and revising markov logic networks for transfer learning[C]//AAAI. 2007, 7: 608-614.

    28. Mihalkova L, Mooney R J. Transfer learning by mapping with minimal target data[C]//Proceedings of the AAAI-08 workshop on transfer learning for complex tasks. 2008: 31-36.

    29. 蒋晓玲, 吴映波, 陈蒙, 等. 基于跨域结构保持投影的异构在线多源迁移学习方法[J]. 电子学报, 2023, 51(8): 1983-1994.

    30. He K, Girshick R, Dollár P. Rethinking imagenet pre-training[C]//Proceedings of the IEEE/CVF International Conference on Computer Vision. 2019: 4918-4927.

    31. Kornblith S, Shlens J, Le Q V. Do better imagenet models transfer better?[C]//Proceedings of the IEEE/CVF conference on computer vision and pattern recognition. 2019: 2661-2671.

    32. Gretton A, Borgwardt K M, Rasch M J, et al. A kernel two-sample test[J]. The Journal of Machine Learning Research, 2012, 13(1): 723-773.

    33. Zellinger W, Lughofer E, Saminger-Platz S, et al. Central moment discrepancy (CMD) for domain-invariant representation learning[C]//International Conference on Learning Representations. 2017.

    34. Sun B, Feng J, Saenko K. Correlation alignment for unsupervised domain adaptation[J]. Domain adaptation in computer vision applications, 2017: 153-171.

    35. Tzeng E, Hoffman J, Zhang N, et al. Deep domain confusion: Maximizing for domain invariance[J]. arXiv preprint arXiv:1412.3474, 2014.

    36. Ganin Y, Lempitsky V. Unsupervised domain adaptation by backpropagation[C]//International conference on machine learning. PMLR, 2015: 1180-1189.

    37. Bousmalis K, Trigeorgis G, Silberman N, et al. Domain separation networks[C]//Proceedings of the 30th International Conference on Neural Information Processing Systems. 2016: 343-351.

    38. Pan S J, Yang Q. A survey on transfer learning[J]. IEEE Transactions on knowledge and data engineering, 2009, 22(10): 1345-1359.

    39. 王晋东, 陈益强. 迁移学习导论[M]. 北京: 电子工业出版社, 2021: 27.

    40. Wang A, Singh A, Michael J, et al. GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding[C]//International Conference on Learning Representations. 2018.

    41. Devlin J, Chang M W, Lee K, et al. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding[C]//Proceedings of NAACL-HLT. 2019: 4171-4186.

    42. Radford A, Narasimhan K, Salimans T, et al. Improving language understanding by generative pre-training[R]. OpenAI, 2018.

    43. Yang Z, Dai Z, Yang Y, et al. Xlnet: Generalized autoregressive pretraining for language understanding[J]. Advances in neural information processing systems, 2019, 32.

    44. Ding N, Qin Y, Yang G, et al. Parameter-efficient fine-tuning of large-scale pre-trained language models[J]. Nature Machine Intelligence, 2023, 5(3): 220-235.

    45. Houlsby N, Giurgiu A, Jastrzebski S, et al. Parameter-efficient transfer learning for NLP[C]//International Conference on Machine Learning. PMLR, 2019: 2790-2799.

    46. Karimi Mahabadi R, Henderson J, Ruder S. Compacter: Efficient low-rank hypercomplex adapter layers[J]. Advances in Neural Information Processing Systems, 2021, 34: 1022-1035.

    47. Li X L, Liang P. Prefix-Tuning: Optimizing Continuous Prompts for Generation[C]//Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers). 2021: 4582-4597.

    48. Pfeiffer J, Kamath A, Rücklé A, et al. AdapterFusion: Non-destructive task composition for transfer learning[C]//16th Conference of the European Chapter of the Association for Computational Linguistics, EACL 2021. Association for Computational Linguistics (ACL), 2021: 487-503.

    49. Bahdanau D, Cho K, Bengio Y. Neural Machine Translation by Jointly Learning to Align and Translate[C]//International Conference on Learning Representations. 2015.

    50. Fang F, Dutta K, Datta A. Domain adaptation for sentiment classification in light of multiple sources[J]. INFORMS Journal on Computing, 2014, 26(3): 586-598.

    51. Liu P, Qiu X, Huang X J. Adversarial Multi-task Learning for Text Classification[C]//Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers). 2017: 1-10.

    52. Malik B, Kashyap A R, Kan M Y, et al. UDAPTER-Efficient Domain Adaptation Using Adapters[C]//Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics. 2023: 2241-2255.

    53. Maas A, Daly R E, Pham P T, et al. Learning word vectors for sentiment analysis[C]//Proceedings of the 49th annual meeting of the association for computational linguistics: Human language technologies. 2011: 142-150.

    54. Pang B, Lee L. Seeing stars: Exploiting class relationships for sentiment categorization with respect to rating scales[C]//Proceedings of the 43rd Annual Meeting of the Association for Computational Linguistics (ACL'05). 2005: 115-124.

    55. Williams A, Nangia N, Bowman S R. A Broad-Coverage Challenge Corpus for Sentence Understanding through Inference[C]//Proceedings of NAACL-HLT. 2018: 1112-1122.

    56. Ganin Y, Ustinova E, Ajakan H, et al. Domain-adversarial training of neural networks[J]. Journal of machine learning research, 2016, 17(59): 1-35.

    Final reference list

    [1] Jordan M I,Mitchell T M. Machine learning: Trends,perspectives,and prospects[J]. Science,2015,349(6245): 255-260.

    [2] Burkart N,Huber M F. A survey on the explainability of supervised machine learning[J]. Journal of Artificial Intelligence Research,2021,70: 245-317.

    [3] Tan C,Sun F,Kong T,et al. A survey on deep transfer learning[C]//Artificial Neural Networks and Machine Learning–ICANN 2018: 27th International Conference on Artificial Neural Networks,Rhodes,Greece,October 4-7,2018,Proceedings,Part III 27. Springer International Publishing,2018: 270-279.

    [4] 李晶晶,孟利超,张可,等. 领域自适应研究综述[J]. 计算机工程,2021,47(6): 1-13.

    [5] Pan S J,Tsang I W,Kwok J T,et al. Domain adaptation via transfer component analysis[J]. IEEE transactions on neural networks,2010,22(2): 199-210.

    [6] 崔福伟,吴璇璇,陈钰枫,刘健,徐金安.融合知识的领域自适应方法综述[J]. 计算机科学,2023,50(8): 142-149.

    [7] Wilson G,Cook D J. A survey of unsupervised deep domain adaptation[J]. ACM Transactions on Intelligent Systems and Technology (TIST),2020,11(5): 1-46.

    [8] Ding Z,Shao M,Fu Y. Incomplete multisource transfer learning[J]. IEEE transactions on neural networks and learning systems,2016,29(2): 310-323.

    [9] Crammer K,Kearns M,Wortman J. Learning from Multiple Sources[J]. Journal of Machine Learning Research,2008,9: 1757-1774.

    [10] Mansour Y,Mohri M,Rostamizadeh A. Domain adaptation with multiple sources[C]. Proceedings of the 21st International Conference on Neural Information Processing Systems. 2008: 1041-1048.

    [11] 岳增营,叶霞,刘睿珩. 基于语言模型的预训练技术研究综述[J]. 中文信息学报,2021,35(9): 15-29.

    [12] 董莹莹,邓万宇,刘光达. 基于 score 样本选择的同构域适应迁移学习[J]. 计算机与数字工程,2019,47(12): 2989-2992,3153.

    [13] 孙武,邓赵红,娄琼丹,顾鑫,王士同.基于模糊规则学习的无监督异构领域自适应[J].计算机科学与探索,2022,16(2):403-412

    [14] Li J,Lu K,Huang Z,et al. Heterogeneous domain adaptation through progressive alignment[J]. IEEE transactions on neural networks and learning systems,2018,30(5): 1381-1391.

    [15] 邓丽萍,罗智勇. 基于半监督 CRF 的跨领域中文分词[J]. 中文信息学报,2017,31(4): 9-19.

    [16] 莫宏伟,傅智杰. 基于迁移学习的无监督跨域人脸表情识别[J]. 智能系统学报,2021,16(3): 397-406.

    [17] 蔡瑞初,李嘉豪,郝志峰.基于类内最大均值差异的无监督领域自适应算法[J].计算机应用研究,2020,37(8):2371-2375

    [18] Zhang W,Deng L,Zhang L,et al. A survey on negative transfer[J]. IEEE/CAA Journal of Automatica Sinica,2022,10(2): 305-329.

    [19] Zhao H,Zhang S,Wu G,et al. Adversarial multiple source domain adaptation[J]. Advances in neural information processing systems,2018,31.

    [20] Zhu Y,Zhuang F,Wang D. Aligning domain-specific distribution and classifier for cross-domain classification from multiple sources[C]//Proceedings of the AAAI conference on artificial intelligence. 2019,33(01): 5989-5996.

    [21] 毛发贵,李碧雯,沈备军. 基于实例迁移的跨项目软件缺陷预测[J]. 计算机科学与探索,2016,10(1): 43-55.

    [22] 张倩,李海港,李明,等. 基于多源动态 TrAdaBoost 的实例迁移学习方法[J]. 中国矿业大学学报,2014,43(4): 713-720.

    [23] 倪超,陈翔,刘望舒,等. 基于特征迁移和实例迁移的跨项目缺陷预测方法[J]. 软件学报,2019,30(5): 1308-1329.

    [24] Huang J T,Li J,Yu D,et al. Cross-language knowledge transfer using multilingual deep neural network with shared hidden layers[C]//2013 IEEE international conference on acoustics,speech and signal processing. IEEE,2013: 7304-7308.

    [25] Long M,Zhu H,Wang J,et al. Unsupervised domain adaptation with residual transfer networks[J]. Advances in neural information processing systems,2016,29.

    [26] Zhu H,Long M,Wang J,et al. Deep Hashing Network for efficient similarity retrieval[C]//Proceedings of the Thirtieth AAAI Conference on Artificial Intelligence. 2016: 2415-2421.

    [27] Mihalkova L,Huynh T,Mooney R J. Mapping and revising markov logic networks for transfer learning[C]//AAAI. 2007,7: 608-614.

    [28] Mihalkova L,Mooney R J. Transfer learning by mapping with minimal target data[C]//Proceedings of the AAAI-08 workshop on transfer learning for complex tasks. 2008: 31-36.

    [29] 蒋晓玲,吴映波,陈蒙,等. 基于跨域结构保持投影的异构在线多源迁移学习方法[J]. 电子学报,2023,51(8): 1983-1994.

    [30] He K,Girshick R,Dollár P. Rethinking imagenet pre-training[C]//Proceedings of the IEEE/CVF International Conference on Computer Vision. 2019: 4918-4927.

    [31] Kornblith S,Shlens J,Le Q V. Do better imagenet models transfer better?[C]//Proceedings of the IEEE/CVF conference on computer vision and pattern recognition. 2019: 2661-2671.

    [32] Gretton A,Borgwardt K M,Rasch M J,et al. A kernel two-sample test[J]. The Journal of Machine Learning Research,2012,13(1): 723-773.

    [33] Zellinger W,Lughofer E,Saminger-Platz S,et al. Central moment discrepancy (CMD) for domain-invariant representation learning[C]//International Conference on Learning Representations. 2017.

    [34] Sun B,Feng J,Saenko K. Correlation alignment for unsupervised domain adaptation[J]. Domain adaptation in computer vision applications,2017: 153-171.

    [35] Long M, Cao Y, Wang J, et al. Learning transferable features with deep adaptation networks[C]//International conference on machine learning. PMLR, 2015: 97-105.

    [36] Ganin Y,Lempitsky V. Unsupervised domain adaptation by backpropagation[C]//International conference on machine learning. PMLR,2015: 1180-1189.

    [37] Bousmalis K,Trigeorgis G,Silberman N,et al. Domain separation networks[C]//Proceedings of the 30th International Conference on Neural Information Processing Systems. 2016: 343-351.

    [38] Pan S J,Yang Q. A survey on transfer learning[J]. IEEE Transactions on knowledge and data engineering,2009,22(10): 1345-1359.

    [39] 王晋东,陈益强. 迁移学习导论[M]. 北京: 电子工业出版社,2021: 27.

    [40] Wang A,Singh A,Michael J,et al. GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding[C]//International Conference on Learning Representations. 2018.

    [41] Devlin J,Chang M W,Lee K,et al. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding[C]//Proceedings of NAACL-HLT. 2019: 4171-4186.

    [42] Radford A,Narasimhan K,Salimans T,et al. Improving language understanding by generative pre-training[R]. OpenAI,2018.

    [43] Yang Z,Dai Z,Yang Y,et al. XLNet: Generalized autoregressive pretraining for language understanding[J]. Advances in neural information processing systems,2019,32.

    [44] Vaswani A, Shazeer N, Parmar N, et al. Attention is all you need[J]. Advances in neural information processing systems, 2017, 30.

    [45] Ding N,Qin Y,Yang G,et al. Parameter-efficient fine-tuning of large-scale pre-trained language models[J]. Nature Machine Intelligence,2023,5(3): 220-235.

    [46] Houlsby N,Giurgiu A,Jastrzebski S,et al. Parameter-efficient transfer learning for NLP[C]//International Conference on Machine Learning. PMLR,2019: 2790-2799.

    [47] Karimi Mahabadi R,Henderson J,Ruder S. Compacter: Efficient low-rank hypercomplex adapter layers[J]. Advances in Neural Information Processing Systems,2021,34: 1022-1035.

    [48] Li X L,Liang P. Prefix-Tuning: Optimizing Continuous Prompts for Generation[C]//Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers). 2021: 4582-4597.

    [49] Pfeiffer J,Kamath A,Rücklé A,et al. AdapterFusion: Non-destructive task composition for transfer learning[C]//16th Conference of the European Chapter of the Association for Computational Linguistics,EACL 2021. Association for Computational Linguistics (ACL),2021: 487-503.

    [50] Bahdanau D,Cho K,Bengio Y. Neural Machine Translation by Jointly Learning to Align and Translate[C]//International Conference on Learning Representations. 2015.

    [51] Fang F,Dutta K,Datta A. Domain adaptation for sentiment classification in light of multiple sources[J]. INFORMS Journal on Computing,2014,26(3): 586-598.

    [52] Liu P,Qiu X,Huang X J. Adversarial Multi-task Learning for Text Classification[C]//Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers). 2017: 1-10.

    [53] Malik B,Kashyap A R,Kan M Y,et al. UDAPTER-Efficient Domain Adaptation Using Adapters[C]//Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics. 2023: 2241-2255.

    [54] Maas A,Daly R E,Pham P T,et al. Learning word vectors for sentiment analysis[C]//Proceedings of the 49th annual meeting of the association for computational linguistics: Human language technologies. 2011: 142-150.

    [55] Pang B,Lee L. Seeing stars: Exploiting class relationships for sentiment categorization with respect to rating scales[C]//Proceedings of the 43rd Annual Meeting of the Association for Computational Linguistics (ACL'05). 2005: 115-124.

    [56] Williams A,Nangia N,Bowman S R. A Broad-Coverage Challenge Corpus for Sentence Understanding through Inference[C]//Proceedings of NAACL-HLT. 2018: 1112-1122.

    [57] Ganin Y,Ustinova E,Ajakan H,et al. Domain-adversarial training of neural networks[J]. Journal of machine learning research,2016,17(59): 1-35.

    Additions

    杨文阳,孔科迪.基于ERNIE-BiLSTM的社交网络文本情感分析[J].中国电子科学研究院学报,2023,18(04):321-327.
    周毅勇.电子商务产品评论情感分析模型的研究与构建[J].网络安全技术与应用,2024,(01):50-53.
    李爱黎,张子帅,林荫,等.基于社交网络大数据的民众情感监测研究[J].大数据,2022,8(06):105-126.
    Liu X, Dong Z, Zhang P. Tackling Data Bias in MUSIC-AVQA: Crafting a Balanced Dataset for Unbiased Question-Answering[C]//Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision. 2024: 4478-4487.
    陈晨,朱晴晴,严睿,等.基于深度学习的开放领域对话系统研究综述[J].计算机学报,2019,42(07):1439-1466.
    Zhang T, Ladhak F, Durmus E, et al. Benchmarking large language models for news summarization[J]. Transactions of the Association for Computational Linguistics, 2024, 12: 39-57.

