Research Post

Optimal Collusion-Free Teaching

Abstract

Formal models of learning from teachers need to respect certain criteria to avoid collusion. The most commonly accepted notion of collusion-freeness was proposed by Goldman and Mathias (1996), and various teaching models obeying their criterion have been studied. For each model M and each concept class C, a parameter M-TD(C) refers to the teaching dimension of concept class C in model M, defined as the number of examples required for teaching a concept, in the worst case over all concepts in C. This paper introduces a new model of teaching, called no-clash teaching, together with the corresponding parameter NCTD(C). No-clash teaching is provably optimal in the strong sense that, given any concept class C and any model M obeying Goldman and Mathias's collusion-freeness criterion, one obtains NCTD(C) ≤ M-TD(C). We also study a corresponding notion NCTD+ for the case of learning from positive data only, establish useful bounds on NCTD and NCTD+, and discuss relations of these parameters to the VC-dimension and to sample compression.

In addition to formulating an optimal model of collusion-free teaching, our main results are on the computational complexity of deciding whether NCTD+(C) = k (or NCTD(C) = k) for given C and k. We show some such decision problems to be equivalent to the existence question for certain constrained matchings in bipartite graphs. Our NP-hardness results for the latter are of independent interest in the study of constrained graph matchings.
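To make the teaching-dimension parameter concrete, the sketch below brute-forces the classical teaching dimension of Goldman and Kearns (1995) for a toy concept class: the smallest set of labeled examples that distinguishes a concept from all others, taken in the worst case over the class. This is an illustration under our own assumptions, not the no-clash variant NCTD introduced in the paper; the instance space, concept class, and function names are ours.

```python
from itertools import combinations

def teaching_dimension(concepts, instances):
    """Classical teaching dimension: worst-case (over concepts) size of a
    smallest sample of labeled examples consistent with that concept and
    with no other concept in the class."""
    def consistent(concept, sample):
        # A concept agrees with a sample if it labels every example the same way.
        return all((x in concept) == label for x, label in sample)

    worst = 0
    for c in concepts:
        # Try samples of increasing size until one rules out every other concept.
        for size in range(len(instances) + 1):
            found = False
            for xs in combinations(instances, size):
                sample = [(x, x in c) for x in xs]
                if not any(consistent(d, sample) for d in concepts if d != c):
                    found = True
                    break
            if found:
                worst = max(worst, size)
                break
    return worst

# Toy example (hypothetical): the empty set plus all singletons over X = {0, 1, 2}.
X = [0, 1, 2]
C = [frozenset(), frozenset({0}), frozenset({1}), frozenset({2})]
print(teaching_dimension(C, X))  # prints 3: the empty concept needs all three negative examples
```

In this toy class, each singleton is taught by its single positive example, while the empty concept needs every instance labeled negative, which is what drives the worst case.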
