Maxwell Nye
adept.ai
Verified email at alum.mit.edu - Homepage
Title
Cited by
Year
Program synthesis with large language models
J Austin, A Odena, M Nye, M Bosma, H Michalewski, D Dohan, E Jiang, ...
arXiv preprint arXiv:2108.07732, 2021
867 · 2021
Show your work: Scratchpads for intermediate computation with language models
M Nye, AJ Andreassen, G Gur-Ari, H Michalewski, J Austin, D Bieber, ...
arXiv preprint arXiv:2112.00114, 2021
447 · 2021
DreamCoder: growing generalizable, interpretable knowledge with wake–sleep Bayesian program learning
K Ellis, L Wong, M Nye, M Sable-Meyer, L Cary, L Anaya Pozo, L Hewitt, ...
Philosophical Transactions of the Royal Society A 381 (2251), 20220050, 2023
204 · 2023
DreamCoder: Bootstrapping inductive program synthesis with wake-sleep library learning
K Ellis, C Wong, M Nye, M Sablé-Meyer, L Morales, L Hewitt, L Cary, ...
Proceedings of the 42nd ACM SIGPLAN International Conference on Programming …, 2021
172 · 2021
Write, execute, assess: Program synthesis with a repl
K Ellis, M Nye, Y Pu, F Sosa, J Tenenbaum, A Solar-Lezama
Advances in Neural Information Processing Systems 32, 2019
155 · 2019
Implicit representations of meaning in neural language models
BZ Li, M Nye, J Andreas
arXiv preprint arXiv:2106.00737, 2021
126 · 2021
Learning to infer program sketches
M Nye, L Hewitt, J Tenenbaum, A Solar-Lezama
International Conference on Machine Learning, 4861-4870, 2019
120 · 2019
Learning compositional rules via neural program synthesis
M Nye, A Solar-Lezama, J Tenenbaum, BM Lake
Advances in Neural Information Processing Systems 33, 10832-10842, 2020
111 · 2020
Improving coherence and consistency in neural sequence models with dual-system, neuro-symbolic reasoning
M Nye, M Tessler, J Tenenbaum, BM Lake
Advances in Neural Information Processing Systems 34, 25192-25204, 2021
93 · 2021
The variational homoencoder: Learning to learn high capacity generative models from few examples
LB Hewitt, MI Nye, A Gane, T Jaakkola, JB Tenenbaum
arXiv preprint arXiv:1807.08919, 2018
73 · 2018
Communicating natural programs to humans and machines
S Acquaviva, Y Pu, M Kryven, T Sechopoulos, C Wong, G Ecanow, M Nye, ...
Advances in Neural Information Processing Systems 35, 3731-3743, 2022
47 · 2022
Program synthesis with large language models. CoRR abs/2108.07732 (2021)
J Austin, A Odena, MI Nye, M Bosma, H Michalewski, D Dohan, E Jiang, ...
arXiv preprint arXiv:2108.07732, 2021
39 · 2021
Introducing our multimodal models, 2023
R Bavishi, E Elsen, C Hawthorne, M Nye, A Odena, A Somani, S Tasırlar
URL https://www.adept.ai/blog/fuyu-8b
34
Show your work: Scratchpads for intermediate computation with language models, 2021
M Nye, AJ Andreassen, G Gur-Ari, H Michalewski, J Austin, D Bieber, ...
URL https://arxiv.org/abs/2112.00114, 2021
33 · 2021
Representing partial programs with blended abstract semantics
M Nye, Y Pu, M Bowers, J Andreas, JB Tenenbaum, A Solar-Lezama
arXiv preprint arXiv:2012.12964, 2020
25 · 2020
A large-scale benchmark for few-shot program induction and synthesis
F Alet, J Lopez-Contreras, J Koppel, M Nye, A Solar-Lezama, ...
International Conference on Machine Learning, 175-186, 2021
21 · 2021
Are efficient deep representations learnable?
M Nye, A Saxe
arXiv preprint arXiv:1807.06399, 2018
21 · 2018
Program synthesis with large language models (2021)
J Austin, A Odena, M Nye, M Bosma, H Michalewski, D Dohan, E Jiang, ...
arXiv preprint arXiv:2108.07732, 2021
13 · 2021
Language modeling with latent situations
BZ Li, M Nye, J Andreas
arXiv preprint arXiv:2212.10012, 2022
8 · 2022
LARC: Language annotated abstraction and reasoning corpus
S Acquaviva, Y Pu, M Nye, C Wong, MH Tessler, J Tenenbaum
Proceedings of the Annual Meeting of the Cognitive Science Society 43 (43), 2021
4 · 2021
Articles 1–20