# Math

268 papers with code • 0 benchmarks • 0 datasets

## Benchmarks

These leaderboards are used to track progress in Math.

## Libraries

Use these libraries to find Math models and implementations.

## Most implemented papers

### Chain-of-Thought Prompting Elicits Reasoning in Large Language Models

We explore how generating a chain of thought -- a series of intermediate reasoning steps -- significantly improves the ability of large language models to perform complex reasoning.
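A minimal sketch of the idea, assuming a simple string-based prompt builder (the helper name is illustrative, not from the paper): few-shot exemplars include worked reasoning steps before each final answer, which nudges the model to emit intermediate reasoning for the new question as well.

```python
def build_cot_prompt(question: str) -> str:
    """Prepend a worked chain-of-thought exemplar to a new question."""
    exemplar = (
        "Q: Roger has 5 tennis balls. He buys 2 more cans of tennis balls. "
        "Each can has 3 tennis balls. How many tennis balls does he have now?\n"
        "A: Roger started with 5 balls. 2 cans of 3 tennis balls each is "
        "6 tennis balls. 5 + 6 = 11. The answer is 11.\n\n"
    )
    # The model is expected to continue after "A:" with its own reasoning steps.
    return exemplar + f"Q: {question}\nA:"

prompt = build_cot_prompt(
    "The cafeteria had 23 apples. They used 20 and bought 6 more. "
    "How many apples do they have?"
)
print(prompt)
```

The exemplar above is the worked arithmetic example popularized by the paper; in practice several such exemplars are concatenated before the target question.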

### GPT-4 Technical Report

We report the development of GPT-4, a large-scale, multimodal model which can accept image and text inputs and produce text outputs.

### The Matrix Calculus You Need For Deep Learning

This paper is an attempt to explain all the matrix calculus you need in order to understand the training of deep neural networks.
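An illustrative sketch (not code from the paper) of the kind of identity it covers: for a single neuron $a = \max(0,\; \mathbf{w} \cdot \mathbf{x} + b)$, the chain rule gives $\partial a / \partial w_i = [\,\mathbf{w} \cdot \mathbf{x} + b > 0\,]\, x_i$. The snippet checks this analytic gradient against a central finite-difference approximation.

```python
import numpy as np

rng = np.random.default_rng(0)
w = rng.normal(size=4)
x = rng.normal(size=4)
b = 0.5

def neuron(w):
    # ReLU applied to an affine function of the weights
    return max(0.0, float(w @ x + b))

# Analytic gradient via the chain rule: indicator(pre-activation > 0) * x
pre = float(w @ x + b)
analytic = (1.0 if pre > 0 else 0.0) * x

# Central finite differences, one weight at a time
eps = 1e-6
numeric = np.array([
    (neuron(w + eps * np.eye(4)[i]) - neuron(w - eps * np.eye(4)[i])) / (2 * eps)
    for i in range(4)
])

print(np.allclose(analytic, numeric, atol=1e-5))
```

The same element-by-element reasoning, organized into Jacobian matrices, is what the paper scales up to full layers and losses.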

### Full Page Handwriting Recognition via Image to Sequence Extraction

We present a Neural Network based Handwritten Text Recognition (HTR) model architecture that can be trained to recognize full pages of handwritten or printed text without image segmentation.

### PaLM: Scaling Language Modeling with Pathways

To further our understanding of the impact of scale on few-shot learning, we trained a 540-billion parameter, densely activated, Transformer language model, which we call Pathways Language Model PaLM.

### Mistral 7B

We introduce Mistral 7B v0.1, a 7-billion-parameter language model engineered for superior performance and efficiency.

### Measuring Mathematical Problem Solving With the MATH Dataset

To facilitate future research and increase accuracy on MATH, we also contribute a large auxiliary pretraining dataset which helps teach models the fundamentals of mathematics.

### How is ChatGPT's behavior changing over time?

We find that the performance and behavior of both GPT-3.5 and GPT-4 can vary greatly over time.

### Llemma: An Open Language Model For Mathematics

We present Llemma, a large language model for mathematics.

### Enhancing the Transformer with Explicit Relational Encoding for Math Problem Solving

We incorporate Tensor-Product Representations within the Transformer in order to better support the explicit representation of relation structure.
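A minimal sketch of the Tensor-Product Representation idea the paper builds on (Smolensky-style role-filler binding), not the paper's exact architecture: each filler vector is bound to an orthonormal role vector via an outer product, the bindings are superposed by summation, and a filler is recovered by multiplying the sum with its role.

```python
import numpy as np

# Two filler vectors (the "content") and two orthonormal role vectors
# (the "structural positions" they occupy).
fillers = np.array([[1.0, 0.0, 0.0],
                    [0.0, 2.0, 0.0]])
roles = np.eye(3)[:2]

# Bind each filler to its role with an outer product, then superpose:
#   T = sum_i  f_i ⊗ r_i
T = sum(np.outer(f, r) for f, r in zip(fillers, roles))

# Unbinding: because the roles are orthonormal, T @ r_i recovers f_i exactly.
recovered = T @ roles[1]
print(recovered)  # recovers fillers[1]
```

With orthonormal roles the unbinding is exact; with merely independent random roles it is approximate, which is the usual trade-off when these representations are embedded inside a learned model.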