learning.learners

Module: learning.learners

Inheritance diagram for selectinf.learning.learners:

learning.learners.mixture_learner → learning.learners.sparse_mixture_learner ("Move only along one dimension at a time")

Classes

mixture_learner

class selectinf.learning.learners.mixture_learner(algorithm, observed_outcome, observed_sampler, observed_target, target_cov, cross_cov)[source]

Bases: object

__init__(algorithm, observed_outcome, observed_sampler, observed_target, target_cov, cross_cov)[source]

Learn a function

P(Y=1|T, N=S-c*T)

where N is the sufficient statistic corresponding to nuisance parameters and T is our target. The random variable Y is

Y = check_selection(algorithm(perturbed_sampler))

That is, we perturb the center of observed_sampler along a ray (or higher-dimensional affine subspace) and rerun the algorithm, checking to see if the test check_selection passes.

For full model inference, check_selection will typically check to see if a given feature is still in the selected set. For general targets, we will typically condition on the exact observed value of algorithm(observed_sampler).

Parameters

algorithm : callable

Selection algorithm that takes a noise source as its only argument.

observed_set : set(int)

The purported value of algorithm(observed_sampler), i.e. the result of running the algorithm with the original seed.

feature : int

One of the elements of observed_set.

observed_sampler : normal_source

Representation of the data used in the selection procedure.

learning_proposal : callable

Proposed position of a new T at which to evaluate the algorithm.

scales = [0.5, 1, 1.5, 2, 5, 10]
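
To make the constructor arguments concrete, here is a minimal construction sketch. The toy data, the toy selection algorithm, and the StandInSampler object are illustrative assumptions, not part of the documented API; in a real workflow observed_sampler is the normal_source-style object produced elsewhere in selectinf, and the target and covariances are estimated from the fitted model:

    import numpy as np
    from selectinf.learning.learners import mixture_learner

    class StandInSampler:
        # Minimal stand-in for the documented `normal_source`: it records the
        # center (the observed sufficient statistic S) and its covariance.
        # The real object comes from selectinf's learning utilities.
        def __init__(self, center, covariance):
            self.center = np.asarray(center)
            self.covariance = np.asarray(covariance)

    # Toy observed data: a 5-dimensional sufficient statistic S.
    S = np.array([0.2, 2.5, -0.3, 1.7, 0.1])
    covS = np.identity(5)

    def algorithm(noise_source):
        # Toy selection algorithm.  The documented contract is that it takes a
        # noise source as its only argument; here we assume the source either
        # is, or exposes via `.center`, a data vector (an assumption).
        data = getattr(noise_source, 'center', noise_source)
        return set(np.nonzero(np.asarray(data) > 1.)[0])

    observed_sampler = StandInSampler(S, covS)
    observed_outcome = algorithm(observed_sampler)   # {1, 3} for the toy S above

    # Target T: the coordinate of S for feature 1 (an arbitrary choice here).
    idx = 1
    observed_target = S[idx:idx + 1]
    target_cov = covS[idx:idx + 1, idx:idx + 1]      # Var(T)
    cross_cov = covS[:, idx:idx + 1]                 # Cov(S, T)

    learner = mixture_learner(algorithm,
                              observed_outcome,
                              observed_sampler,
                              observed_target,
                              target_cov,
                              cross_cov)
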
learning_proposal()[source]

The general return value should be (data, target), where data is the argument passed to the selection algorithm and target is the (possibly conditional) MLE of our parametric model.
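
As a call-pattern sketch, continuing from the construction above (and therefore relying on its stand-in objects):

    # `data` is what the selection algorithm will be run on for this query;
    # `target` is the (possibly conditional) MLE of the parametric model.
    data, target = learner.learning_proposal()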

proposal_density(target_val)[source]

The (conditional, given self.center) density of our draws.

Parameters

target_val : np.ndarray((-1, self.center.shape))

generate_data(B=500, check_selection=None)[source]
Parameters

B : int

How many queries?

check_selection : callable (optional)

Callable that determines selection variable.

Returns

Y : np.array((B, -1))

Binary responses for learning selection.

T : np.array((B, -1))

Points of the target at which the response was evaluated (the features used in the learning algorithm); successive draws from self.learning_proposal.

algorithm : callable

Algorithm taking an argument of shape (T.shape[1],) and returning something of shape (Y.shape[1],).
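
A usage sketch, continuing from the construction above. The check_selection callable follows the docstring's Y = check_selection(algorithm(perturbed_sampler)); here it asks whether feature 1 (the target's feature in the toy sketch) was re-selected:

    def check_selection(selected_set):
        # One binary response per column of Y: was feature 1 re-selected?
        return [1 in selected_set]

    Y, T, wrapped_algorithm = learner.generate_data(B=200,
                                                    check_selection=check_selection)
    # Per the Returns above: Y is np.array((200, -1)) of binary responses,
    # T is np.array((200, -1)) of proposed target values, and
    # wrapped_algorithm maps a target of shape (T.shape[1],) to responses.
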

learn(fit_probability, fit_args={}, B=500, check_selection=None, verbose=False)[source]
Parameters

fit_probability : callable

Function to learn a probability model P(Y=1|T) based on [T, Y].

fit_args : dict

Keyword arguments to fit_probability.

B : int

How many queries?

check_selection : callable (optional)

Callable that determines selection variable.

verbose : bool

Print out probability of selection?
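
Finally, a sketch of the learn call, continuing from the sketches above (numpy and check_selection defined there). fit_probability below is a hypothetical stand-in built on scikit-learn's LogisticRegression; its return convention (one probability function per column of Y) is an assumption for illustration, not documented API:

    from sklearn.linear_model import LogisticRegression

    def fit_probability(T, Y, **fit_args):
        # Hypothetical stand-in: fit P(Y=1 | T) for each column of Y and
        # return one prediction function per column (assumed convention).
        T = np.asarray(T)
        Y = np.asarray(Y).reshape(len(T), -1)
        fns = []
        for j in range(Y.shape[1]):
            clf = LogisticRegression(**fit_args).fit(T, Y[:, j].astype(int))
            fns.append(lambda t, clf=clf: clf.predict_proba(np.atleast_2d(t))[:, 1])
        return fns

    result = learner.learn(fit_probability,
                           fit_args={'C': 1.0},
                           B=200,
                           check_selection=check_selection,
                           verbose=False)
    # The exact return value of learn is not documented here; see the source.
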

sparse_mixture_learner

class selectinf.learning.learners.sparse_mixture_learner(algorithm, observed_outcome, observed_sampler, observed_target, target_cov, cross_cov)[source]

Bases: selectinf.learning.learners.mixture_learner

Move only along one dimension at a time
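
Since the constructor signature and the methods are inherited unchanged, the class is a drop-in replacement for mixture_learner in the sketches above; only the proposal moves along one dimension of the target at a time, as noted. Reusing the toy objects from the earlier construction sketch:

    from selectinf.learning.learners import sparse_mixture_learner

    # Same documented signature as mixture_learner.
    sparse_learner = sparse_mixture_learner(algorithm,
                                            observed_outcome,
                                            observed_sampler,
                                            observed_target,
                                            target_cov,
                                            cross_cov)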

__init__(algorithm, observed_outcome, observed_sampler, observed_target, target_cov, cross_cov)

Learn a function

P(Y=1|T, N=S-c*T)

where N is the sufficient statistic corresponding to nuisance parameters and T is our target. The random variable Y is

Y = check_selection(algorithm(perturbed_sampler))

That is, we perturb the center of observed_sampler along a ray (or higher-dimensional affine subspace) and rerun the algorithm, checking to see if the test check_selection passes.

For full model inference, check_selection will typically check to see if a given feature is still in the selected set. For general targets, we will typically condition on the exact observed value of algorithm(observed_sampler).

Parameters

algorithm : callable

Selection algorithm that takes a noise source as its only argument.

observed_set : set(int)

The purported value of algorithm(observed_sampler), i.e. the result of running the algorithm with the original seed.

feature : int

One of the elements of observed_set.

observed_sampler : normal_source

Representation of the data used in the selection procedure.

learning_proposal : callable

Proposed position of a new T at which to evaluate the algorithm.

learning_proposal()[source]

The general return value should be (data, target), where data is the argument passed to the selection algorithm and target is the (possibly conditional) MLE of our parametric model.

proposal_density(target_val)[source]

The (conditional, given self.center) density of our draws.

Parameters

target_val : np.ndarray((-1, self.center.shape))

generate_data(B=500, check_selection=None)
Parameters

B : int

How many queries?

check_selection : callable (optional)

Callable that determines selection variable.

Returns

Y : np.array((B, -1))

Binary responses for learning selection.

T : np.array((B, -1))

Points of the target at which the response was evaluated (the features used in the learning algorithm); successive draws from self.learning_proposal.

algorithm : callable

Algorithm taking an argument of shape (T.shape[1],) and returning something of shape (Y.shape[1],).

learn(fit_probability, fit_args={}, B=500, check_selection=None, verbose=False)
Parameters

fit_probability : callable

Function to learn a probability model P(Y=1|T) based on [T, Y].

fit_args : dict

Keyword arguments to fit_probability.

B : int

How many queries?

check_selection : callable (optional)

Callable that determines selection variable.

verbose : bool

Print out probability of selection?

scales = [0.5, 1, 1.5, 2, 5, 10]