Search Results for author: Taisuke Sato

Found 8 papers, 0 papers with code

Towards end-to-end ASP computation

no code implementations • 12 Jun 2023 • Taisuke Sato, Akihiro Takemura, Katsumi Inoue

We propose an end-to-end approach to answer set programming (ASP) that computes, linear-algebraically, stable models satisfying given constraints.
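As a rough illustration of the idea of searching for models of a propositional program in vector spaces by cost minimization, the sketch below relaxes truth values to [0, 1] and runs gradient descent on a non-negative cost built from a tiny program's completion. The program, cost function, and update rule are assumptions made for illustration, not the paper's formulation, and the sketch only finds a supported model; the extra machinery that guarantees stability is omitted.

```python
# Minimal sketch (not the paper's formulation): search for a supported model
# of a tiny propositional program by gradient descent on a non-negative cost
# that vanishes on models of the program's completion.  A stability check
# (e.g., via loop formulas) would still be needed for ASP proper.
import numpy as np

rng = np.random.default_rng(0)
x = rng.random(3)                      # continuous truth values for p, q, r

def cost(x):
    p, q, r = x
    # completion of {p :- q.  q :- p.  r.}:  p <-> q,  q <-> p,  r <-> true
    return (p - q) ** 2 + (q - p) ** 2 + (r - 1.0) ** 2

def grad(x):
    p, q, r = x
    return np.array([4 * (p - q), 4 * (q - p), 2 * (r - 1.0)])

for _ in range(500):                   # plain gradient descent
    x = np.clip(x - 0.1 * grad(x), 0.0, 1.0)

model = {name for name, v in zip("pqr", x) if v > 0.5}
print(cost(x), model)                  # cost ~ 0; the rounded set is a supported model
```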

MatSat: a matrix-based differentiable SAT solver

no code implementations • 14 Aug 2021 • Taisuke Sato, Ryosuke Kojima

We propose a new approach to SAT solving that casts a SAT problem in vector spaces as the minimization of a non-negative differentiable cost function J^sat.
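To make the cost-minimization view of SAT concrete, here is a small sketch that encodes a CNF with positive/negative literal matrices and minimizes a hinge-style violation cost by numerical gradient descent over [0, 1]^n. The clause encoding, cost function, and optimizer are illustrative assumptions, not MatSat's actual J^sat or update scheme.

```python
# Minimal sketch of differentiable SAT solving by cost minimization in a
# vector space.  The encoding and cost below are illustrative assumptions.
import numpy as np

# CNF (x1 v ~x2) & (x2 v x3) & (~x1 v ~x3) over 3 variables.
P = np.array([[1, 0, 0], [0, 1, 1], [0, 0, 0]], dtype=float)  # positive literals
N = np.array([[0, 1, 0], [0, 0, 0], [1, 0, 1]], dtype=float)  # negated literals

def cost(x):
    # each clause wants at least one satisfied literal: P @ x + N @ (1 - x) >= 1
    slack = np.maximum(0.0, 1.0 - (P @ x + N @ (1.0 - x)))
    return np.sum(slack ** 2)

rng = np.random.default_rng(1)
x = rng.random(3)
for _ in range(2000):
    # numerical gradient keeps the sketch short; a real solver would use an
    # analytic gradient of its cost function
    g = np.array([(cost(x + 1e-4 * e) - cost(x - 1e-4 * e)) / 2e-4
                  for e in np.eye(3)])
    x = np.clip(x - 0.2 * g, 0.0, 1.0)

assignment = (x > 0.5).astype(float)
print(assignment, cost(assignment))   # cost 0.0 means the 0/1 assignment satisfies every clause
```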

A tensorized logic programming language for large-scale data

no code implementations • 20 Jan 2019 • Ryosuke Kojima, Taisuke Sato

To embody this programming language, we also introduce a new semantics, termed tensorized semantics, which combines the traditional least model semantics in logic programming with the embeddings of tensors.

Knowledge Graphs
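As a concrete, deliberately simplified picture of evaluating a recursive logic program with tensors, the sketch below computes the least model of the usual edge/path program by iterating the immediate consequence operator in matrix form. The program, its adjacency-matrix encoding, and the min(1, ·) truncation are assumptions chosen for illustration rather than the paper's tensorized semantics.

```python
# Minimal sketch: least model of a recursive program via matrix operations.
import numpy as np

# edge/2 over 4 constants, as an adjacency matrix E (E[i, j] = 1 iff edge(i, j))
E = np.array([[0, 1, 0, 0],
              [0, 0, 1, 0],
              [0, 0, 0, 1],
              [0, 0, 0, 0]], dtype=float)

# path(X, Y) :- edge(X, Y).
# path(X, Y) :- edge(X, Z), path(Z, Y).
P = np.zeros_like(E)
while True:
    # one application of the immediate consequence operator, in matrix form:
    # min(1, E + E @ P) adds every fact derivable from the current P
    P_next = np.minimum(1.0, E + E @ P)
    if np.array_equal(P_next, P):      # least fixpoint reached
        break
    P = P_next

print(P)   # P[i, j] = 1 iff path(i, j) is in the least Herbrand model
```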

Partial Evaluation of Logic Programs in Vector Spaces

no code implementations • 28 Nov 2018 • Chiaki Sakama, Hien D. Nguyen, Taisuke Sato, Katsumi Inoue

In this paper, we introduce methods of encoding propositional logic programs in vector spaces.
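One simple way to picture such an encoding: represent each atom's rule body as a row of a matrix and apply the T_P operator as a matrix-vector product followed by thresholding. The particular weighting below (1/|body|, with facts handled by a bias vector, one rule per head) is an illustrative assumption, not necessarily the encoding used in the paper.

```python
# Minimal sketch: a definite propositional program as a matrix, with its least
# model computed by iterating T_P as a thresholded matrix-vector product.
import numpy as np

atoms = ["p", "q", "r", "s"]
# program:  p :- q, r.   q :- r.   r.   (s has no rule)
# row i encodes the body of the single rule for atom i
M = np.array([[0.0, 0.5, 0.5, 0.0],   # p :- q, r   (each body atom weighted 1/2)
              [0.0, 0.0, 1.0, 0.0],   # q :- r
              [0.0, 0.0, 0.0, 0.0],   # r.          (empty body: handled by bias)
              [0.0, 0.0, 0.0, 0.0]])  # s: no rule
bias = np.array([0.0, 0.0, 1.0, 0.0]) # facts contribute 1 directly

v = np.zeros(4)                        # start from the empty interpretation
for _ in range(len(atoms)):            # at most |atoms| steps to the fixpoint
    v = ((M @ v + bias) >= 1.0).astype(float)   # one application of T_P

print({a for a, t in zip(atoms, v) if t == 1.0})   # least model: {p, q, r}
```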

Embedding Tarskian Semantics in Vector Spaces

no code implementations • 9 Mar 2017 • Taisuke Sato

We propose a new linear algebraic approach to the computation of Tarskian semantics in logic.
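A tiny worked example of the flavor of such an approach: over a finite domain, a binary relation becomes a 0/1 matrix, and conjunction with an existentially quantified shared variable becomes a matrix product truncated back to {0, 1}. The relation and the min(1, ·) truncation are illustrative assumptions rather than the paper's construction.

```python
# Minimal sketch: evaluating a first-order formula over a finite domain with
# adjacency matrices.
import numpy as np

# a binary relation over a domain of 3 entities, as a 0/1 matrix
PARENT = np.array([[0, 1, 0],
                   [0, 0, 1],
                   [0, 0, 0]], dtype=float)

# grandparent(X, Z) := exists Y. parent(X, Y) & parent(Y, Z)
# conjunction + existential quantification become a truncated matrix product
GRANDPARENT = np.minimum(1.0, PARENT @ PARENT)

print(GRANDPARENT)   # GRANDPARENT[x, z] = 1 iff the formula holds for (x, z)
```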

A Linear Algebraic Approach to Datalog Evaluation

no code implementations • 30 Jul 2016 • Taisuke Sato

Given a linear Datalog program DB written using N constants and binary predicates, we first translate the if-and-only-if completions of the clauses in DB into a set Eq(DB) of matrix equations with a non-linear operation, in which the relations in M_DB, the least Herbrand model of DB, are encoded as adjacency matrices.
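For the standard transitive-closure program, the completion path(X,Y) <-> edge(X,Y) v (edge(X,Z) & path(Z,Y)) corresponds to a matrix equation of the shape X = min(1, A + A X). The sketch below solves the linearized equation X = A + A X in closed form and thresholds the result; this particular solution strategy is an illustrative assumption, not necessarily the method developed in the paper.

```python
# Minimal sketch: a Datalog completion as a matrix equation, solved in its
# linearized form and thresholded back to a 0/1 relation.
import numpy as np

# edge/2 over N = 4 constants, as an adjacency matrix A
A = np.array([[0, 1, 0, 0],
              [0, 0, 1, 0],
              [0, 0, 0, 1],
              [0, 0, 0, 0]], dtype=float)

# Dropping the min(1, .) truncation from  X = min(1, A + A @ X)  yields the
# linear equation X = A + A @ X, solvable when (I - A) is invertible:
X = np.linalg.solve(np.eye(4) - A, A)   # entries are path counts of length >= 1

PATH = (X > 0.5).astype(float)   # a positive count means the fact holds
print(PATH)                      # PATH[i, j] = 1 iff path(i, j) holds
```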

A Logic-based Approach to Generatively Defined Discriminative Modeling

no code implementations • 15 Oct 2014 • Taisuke Sato, Keiichi Kubota, Yoshitaka Kameya

Our intention is, first, to provide a unified approach to CRFs for complex modeling through the use of a Turing-complete language and, second, to offer a convenient way of realizing generative-discriminative pairs in machine learning, so that generative and discriminative models can be compared and the better one chosen.

Viterbi training in PRISM

no code implementations • 22 Mar 2013 • Taisuke Sato, Keiichi Kubota

Third, since VT always deals with the probability of a single explanation, the Viterbi explanation, the exclusiveness condition imposed on PRISM programs is no longer required when parameters are learned by VT. Last but not least, because VT in PRISM is general and applicable to any PRISM program, it largely removes the need for the user to develop a specific VT algorithm for a specific model.
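To make the "single Viterbi explanation" idea concrete, the sketch below runs Viterbi training (hard EM) on a toy two-coin mixture: each iteration assigns every observation to its single most probable explanation (which coin generated it) and re-estimates parameters by counting over those explanations only. The model and data are illustrative stand-ins, not a PRISM program.

```python
# Minimal sketch of Viterbi training (hard EM) on a two-coin mixture.
import numpy as np

# each entry: number of heads in 10 tosses of one of two biased coins
heads = np.array([9, 8, 2, 1, 8, 2, 9, 1])
n = 10

theta = np.array([0.6, 0.4])   # initial head probabilities of coin A and coin B

for _ in range(20):
    # Viterbi step: most probable single explanation (which coin) per observation
    loglik = (heads[:, None] * np.log(theta)
              + (n - heads)[:, None] * np.log(1.0 - theta))
    choice = np.argmax(loglik, axis=1)

    # M step: re-estimate each coin's bias from the observations assigned to it
    for c in (0, 1):
        assigned = heads[choice == c]
        if len(assigned) > 0:
            theta[c] = assigned.sum() / (n * len(assigned))

print(theta)   # converges to roughly [0.85, 0.15] for this data
```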
