TPTU: Task planning and tool usage of large language model-based AI agents. arXiv:2308.03427, 2023
68. Patil S G, Zhang T, Wang X, Gonzalez J E. Gorilla: Large language model connected with massive APIs. arXiv preprint arXiv:2305.15334, 2023
69. Li M, Song F, Yu B, Yu H, Li Z, ...
[2] Towards Reasoning in Large Language Models: A Survey, 2022
[3] A Survey of Deep Learning for Mathematical Reasoning, 2022
[4] Template Filling for Controllable Commonsense Reasoning, 2021
[5] Large Language Models are Zero-Shot Reasoners, NeurIPS 2022
[6] Maieutic Prompting: Logically Consistent...
Semi-structured interviews; probabilistic databases

In the current era of artificial intelligence, large language models such as ChatGPT and BARD are being increasingly used for various applications, such as language translation, text generation, and human-like conversation. The fact tha...
Program Synthesis with Large Language Models
Language Models are Few-Shot Learners
Sparks of Artificial General Intelligence: Early experiments with GPT-4
Accelerating Large Language Model Decoding with Speculative Sampling
FrugalGPT: How to Use Large Language Models While Reducing Cost and Improving Perf...
CODEIE: Large Code Generation Models are Better Few-Shot Information Extractors | ACL | 2023-07 | GitHub
CodeKGC: Code Language Model for Generative Knowledge Graph Construction | ACM TALLIP | 2024-03 | GitHub

Information Extraction Techniques
A taxonomy by techniques.

Supervised Fine-tuning
Paper | Venue | Date | Code
Rethink...
Keywords: BERT model; model fine-tuning; performance optimization

1. Introduction
Pre-trained large language models (LLMs) are models trained on large amounts of text data drawn from daily life, which allows them to learn the occurrence probabilities of words or characters within thi...
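To make the notion of learned occurrence probabilities concrete, the sketch below queries a pretrained masked language model for the probabilities it assigns to candidate words at a masked position. This is a minimal illustration only: it assumes the Hugging Face transformers library and the public bert-base-uncased checkpoint, and the example sentence is invented, none of which comes from the excerpt above.

```python
# Minimal sketch: ask a pretrained masked LM (BERT) which words it considers
# most probable at a masked position. Assumes `transformers` and `torch` are installed.
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")
model.eval()

# Hypothetical example sentence with one masked word.
text = f"The weather today is very {tokenizer.mask_token}."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Locate the masked position and turn its logits into a probability distribution.
mask_index = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
probs = torch.softmax(logits[0, mask_index.item()], dim=-1)

# Print the five most probable fill-in tokens with their probabilities.
top = torch.topk(probs, k=5)
for prob, token_id in zip(top.values, top.indices):
    print(f"{tokenizer.decode([int(token_id)]):>12s}  {prob.item():.4f}")
```

The same probabilities are what fine-tuning later adjusts toward a downstream task, which is why the pretraining objective matters for the optimization discussed in this paper.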
there are some poor initial guesses, whereas there are none when the model has prior information. However, in the limit, the models converge to the same NMA. The GPT-3.5 model plots have very few data points, primarily because of the model's inability to output messages in the cor...
Several existing works [27, 44, 57] also incorporate attributes for controlled text generation, but they focus on quite different tasks, such as style transfer, and typically require the attributes to be provided explicitly. In contrast, we introduce a semi-automated strateg...
Because language data have a central role in all areas of psychology, this new technology has the potential to transform the field. In this Perspective, we review the foundations of LLMs. We then explain how the way that LLMs are constructed enables them to effectively generate human-like ...
For queries like "Find me a highly rated camera for wildlife photography compatible with my Nikon F-Mount lenses", existing methods may generate expansions that are semantically similar but structurally unrelated to user intents. To handle such semi-structured queries with both textual and relational...
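The example query mixes a free-text intent with relational constraints. As a purely illustrative sketch (the representation, field names, and rating threshold below are assumptions, not the method described in this excerpt), one way to picture such a semi-structured query is:

```python
# Hypothetical representation of a semi-structured query: a free-text intent
# plus relational constraints kept separate so retrieval can respect them.
from dataclasses import dataclass, field

@dataclass
class SemiStructuredQuery:
    text_intent: str
    relational_constraints: dict = field(default_factory=dict)

query = SemiStructuredQuery(
    text_intent="highly rated camera for wildlife photography",
    relational_constraints={
        "category": "camera",
        "compatible_with": "Nikon F-Mount lenses",
        "min_rating": 4.0,  # assumed threshold for "highly rated"
    },
)

# A purely semantic expansion would only rephrase `text_intent`, ignoring the
# relational constraints, which is the failure mode described above.
print(query)
```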