Search Results for author: Sandra Zilles

Found 12 papers, 2 papers with code

Approximation Algorithms for Preference Aggregation Using CP-Nets

no code implementations14 Dec 2023 Abu Mohammad Hammad Ali, Boting Yang, Sandra Zilles

We first analyze a trivial 2-approximation algorithm that simply outputs the best of the given input preferences, and establish a structural condition under which the approximation ratio of this algorithm is improved to $4/3$.
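
The trivial algorithm described above can be sketched as follows. This is a simplified illustration, not the paper's setting: it uses total orders with Kendall tau distance in place of CP-nets, and the function names are hypothetical. The 2-approximation guarantee follows from the triangle inequality on the distance.

```python
from itertools import combinations

def kendall_tau(r1, r2):
    # number of item pairs that the two rankings order differently
    pos1 = {x: i for i, x in enumerate(r1)}
    pos2 = {x: i for i, x in enumerate(r2)}
    return sum(
        1
        for a, b in combinations(r1, 2)
        if (pos1[a] < pos1[b]) != (pos2[a] < pos2[b])
    )

def best_of_inputs(prefs):
    # trivial aggregation: output the input ranking that minimizes
    # total disagreement with all the given input rankings
    return min(prefs, key=lambda r: sum(kendall_tau(r, s) for s in prefs))

prefs = [
    ["a", "b", "c"],
    ["a", "c", "b"],
    ["b", "a", "c"],
]
print(best_of_inputs(prefs))  # ['a', 'b', 'c']
```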

A Labelled Sample Compression Scheme of Size at Most Quadratic in the VC Dimension

no code implementations24 Dec 2022 Farnam Mansouri, Sandra Zilles

This paper presents a construction of a proper and stable labelled sample compression scheme of size $O(\mathrm{VCD}^2)$ for any finite concept class, where $\mathrm{VCD}$ denotes the Vapnik-Chervonenkis dimension.
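
For small finite classes, the parameter in the bound can be computed by brute force. A sketch (not from the paper) that checks every subset of the domain for shattering:

```python
from itertools import combinations

def shattered(S, concepts):
    # S is shattered if every labelling of S occurs as the trace of some concept
    return len({frozenset(S & c) for c in concepts}) == 2 ** len(S)

def vc_dimension(domain, concepts):
    # brute-force VC dimension of a finite concept class over a finite domain
    concepts = [frozenset(c) for c in concepts]
    d = 0
    for k in range(1, len(domain) + 1):
        if any(shattered(frozenset(S), concepts) for S in combinations(domain, k)):
            d = k
    return d

# the class of singletons (plus the empty set) over {1, 2, 3} has VCD 1
print(vc_dimension([1, 2, 3], [set(), {1}, {2}, {3}]))  # 1
```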


Using Sum-Product Networks to Assess Uncertainty in Deep Active Learning

no code implementations20 Jun 2022 Mohamadsadegh Khosravani, Sandra Zilles

The success of deep active learning hinges on the choice of an effective acquisition function, which ranks as-yet-unlabeled data points according to their expected informativeness.
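
The paper's contribution is to assess uncertainty with sum-product networks; as background, a generic entropy-based acquisition function (a common baseline, not the paper's method) can be sketched as:

```python
import math

def entropy(probs):
    # Shannon entropy of a predictive distribution, in nats
    return -sum(p * math.log(p) for p in probs if p > 0)

def rank_by_uncertainty(pool):
    # pool maps data-point ids to model predictive distributions;
    # rank points from most to least uncertain
    return sorted(pool, key=lambda x: entropy(pool[x]), reverse=True)

pool = {
    "x1": [0.5, 0.5],    # maximally uncertain
    "x2": [0.9, 0.1],
    "x3": [0.99, 0.01],
}
print(rank_by_uncertainty(pool))  # ['x1', 'x2', 'x3']
```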


On the Complexity of Symbolic Finite-State Automata

no code implementations10 Nov 2020 Dana Fisman, Hadar Frenkel, Sandra Zilles

We revisit the complexity of procedures on SFAs (such as intersection, emptiness, etc.).

Optimal Collusion-Free Teaching

no code implementations10 Mar 2019 David Kirkpatrick, Hans U. Simon, Sandra Zilles

In addition to formulating an optimal model of collusion-free teaching, our main results are on the computational complexity of deciding whether $\mathrm{NCTD}^+(\mathcal{C})=k$ (or $\mathrm{NCTD}(\mathcal{C})=k$) for given $\mathcal{C}$ and $k$.
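
The NCTD parameters studied here refine the classical teaching dimension. As background, the classical parameter can be computed by brute force for small finite classes; the following is a sketch of that baseline, not of the paper's collusion-free model:

```python
from itertools import combinations

def teaching_dim(domain, concepts):
    # classical teaching dimension: for each concept, the smallest sample
    # that distinguishes it from every other concept; TD is the worst case
    concepts = [frozenset(c) for c in concepts]

    def td(c):
        for k in range(len(domain) + 1):
            for S in combinations(domain, k):
                # S teaches c if every other concept disagrees with c on some x in S
                if all(any((x in c) != (x in d) for x in S)
                       for d in concepts if d != c):
                    return k
        return len(domain)

    return max(td(c) for c in concepts)

# powerset of {1, 2}: every concept needs both examples to be identified
print(teaching_dim([1, 2], [set(), {1}, {2}, {1, 2}]))  # 2
```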

An Overview of Machine Teaching

no code implementations18 Jan 2018 Xiaojin Zhu, Adish Singla, Sandra Zilles, Anna N. Rafferty

In this paper we try to organize machine teaching as a coherent set of ideas.

The Complexity of Learning Acyclic Conditional Preference Networks

no code implementations11 Jan 2018 Eisa Alanazi, Malek Mouhoub, Sandra Zilles

To assess the optimality of learning algorithms as well as to better understand the combinatorial structure of classes of CP-nets, it is helpful to calculate certain learning-theoretic information complexity parameters.


An Empirical Study of the Effects of Spurious Transitions on Abstraction-based Heuristics

no code implementations14 Nov 2017 Mehdi Sadeqi, Robert C. Holte, Sandra Zilles

However, the quality of abstraction-based heuristic functions, and thus the speed of search, can suffer from spurious transitions, i.e., state transitions in the abstract state space for which no corresponding transitions exist in the reachable component of the original state space.
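
A toy illustration of the phenomenon, with a hypothetical abstraction mapping `phi` (names and setup are illustrative, not from the paper): an abstract transition is spurious when every concrete transition inducing it starts from an unreachable state.

```python
def reachable(start, transitions):
    # BFS over the concrete state space
    seen, frontier = {start}, [start]
    while frontier:
        s = frontier.pop()
        for (a, b) in transitions:
            if a == s and b not in seen:
                seen.add(b)
                frontier.append(b)
    return seen

def spurious_abstract_transitions(start, transitions, phi):
    reach = reachable(start, transitions)
    abstract = {(phi[a], phi[b]) for (a, b) in transitions}
    supported = {(phi[a], phi[b]) for (a, b) in transitions if a in reach}
    return abstract - supported

# s3 is unreachable from s0, so the abstract transition A -> C it induces
# has no reachable concrete counterpart
transitions = {("s0", "s1"), ("s3", "s2")}
phi = {"s0": "A", "s1": "B", "s2": "C", "s3": "A"}
print(spurious_abstract_transitions("s0", transitions, phi))  # {('A', 'C')}
```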

Front-to-End Bidirectional Heuristic Search with Near-Optimal Node Expansions

1 code implementation10 Mar 2017 Jingwei Chen, Robert C. Holte, Sandra Zilles, Nathan R. Sturtevant

pairs, and present a new admissible front-to-end bidirectional heuristic search algorithm, Near-Optimal Bidirectional Search (NBS), that is guaranteed to do no more than 2VC expansions.

Preference-based Teaching

no code implementations6 Feb 2017 Zi-Yuan Gao, Christoph Ries, Hans Ulrich Simon, Sandra Zilles

We introduce a new model of teaching named "preference-based teaching" and a corresponding complexity parameter, the preference-based teaching dimension (PBTD), representing the worst-case number of examples needed to teach any concept in a given concept class.

Interactive Learning from Multiple Noisy Labels

1 code implementation24 Jul 2016 Shankar Vembu, Sandra Zilles

Interactive learning is a process in which a machine learning algorithm is provided with meaningful, well-chosen examples as opposed to randomly chosen examples typical in standard supervised learning.
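
For the noisy-label setting in the title, the simplest aggregation baseline is a majority vote over the annotations of each example; this is a generic baseline for context, not the paper's algorithm:

```python
from collections import Counter

def majority_label(noisy_labels):
    # aggregate several noisy annotations for one example by majority vote
    return Counter(noisy_labels).most_common(1)[0][0]

print(majority_label([1, 0, 1, 1, 0]))  # 1
```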


Combining Models of Approximation with Partial Learning

no code implementations5 Jul 2015 Zi-Yuan Gao, Frank Stephan, Sandra Zilles

Here three variants of approximate learning will be introduced and investigated with respect to the question whether they can be combined with partial learning.

