pyrieef.learning.timeseries package

Submodules

pyrieef.learning.timeseries.functions module

pyrieef.learning.timeseries.functions.colvec(x)
pyrieef.learning.timeseries.functions.condition_gaussian(Mu, Sigma, sample, input, output)
pyrieef.learning.timeseries.functions.eigs(X)

Sorted eigenvalues and eigenvectors

pyrieef.learning.timeseries.functions.gaussian_moment_matching(mus, sigmas, h)
Parameters
  • mus – [np.array([nb_states, nb_timestep, nb_dim])] or [np.array([nb_states, nb_dim])]

  • sigmas – [np.array([nb_states, nb_timestep, nb_dim, nb_dim])] or [np.array([nb_states, nb_dim, nb_dim])]

  • h – [np.array([nb_timestep, nb_states])]

Returns
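The matched Gaussian is the single Gaussian with the same first and second moments as the weighted mixture. A minimal numpy sketch of that computation (illustration only, not necessarily this function's exact broadcasting over time steps):

    import numpy as np

    def moment_match(mus, sigmas, h):
        # Collapse a mixture of Gaussians into one Gaussian with the same
        # first and second moments; mus: [nb_states, nb_dim],
        # sigmas: [nb_states, nb_dim, nb_dim], h: [nb_states] weights.
        mu = np.einsum('k,kd->d', h, mus)
        second_moment = np.einsum(
            'k,kde->de', h, sigmas + np.einsum('kd,ke->kde', mus, mus))
        sigma = second_moment - np.outer(mu, mu)
        return mu, sigma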

pyrieef.learning.timeseries.functions.get_canonical_system(n_vars, dt)

Create a discrete canonical system with n_vars variables and time step dt.

pyrieef.learning.timeseries.functions.get_dynamical_feature_matrix(n_varspos, n_derivs, n_data, n_samples, dt)
Get the dynamical feature matrix that extracts n_derivs dynamical features from a n_varspos*n_data*n_samples vector of data points, using dt as the time discretization.

Output: (PHI1, PHI, T1, T)
  • PHI1: dynamical feature matrix for one sample

  • PHI: dynamical feature matrix for n_samples
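For intuition, a hedged sketch of how such a feature matrix can be assembled from forward finite differences; the function's actual data ordering and its T1/T outputs may differ:

    import numpy as np

    def finite_diff_features(n_varspos, n_derivs, n_data, dt):
        # Hypothetical illustration: map a stacked position trajectory to
        # stacked [position, velocity, ...] features via repeated forward
        # finite differencing, assuming time-major ordering with n_varspos
        # variables per time step.
        fd = (-np.eye(n_data) + np.diag(np.ones(n_data - 1), 1)) / dt
        ops = [np.eye(n_data)]
        for _ in range(1, n_derivs):
            ops.append(fd @ ops[-1])
        return np.kron(np.vstack(ops), np.eye(n_varspos))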

pyrieef.learning.timeseries.functions.get_state_prediction_matrix(A, B, Np, **kwargs)
Returns the matrix to be used for batch prediction of the state of the discrete system

x_{k+1} = A*x_k + B*u_k
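Unrolling the recursion gives x_k = A^k x_0 + sum_{j<k} A^(k-1-j) B u_j, so the whole predicted trajectory can be written as X = Sx x_0 + Su U. A minimal sketch of those two matrices (the names Sx/Su are illustrative, not necessarily this function's return values):

    import numpy as np

    def state_prediction_matrices(A, B, Np):
        # Batch prediction matrices so that the stacked state
        # X = [x_1; ...; x_Np] equals Sx @ x_0 + Su @ U,
        # with U = [u_0; ...; u_{Np-1}].
        nx, nu = B.shape
        Sx = np.vstack([np.linalg.matrix_power(A, i + 1) for i in range(Np)])
        Su = np.zeros((Np * nx, Np * nu))
        for i in range(Np):            # prediction step i+1
            for j in range(i + 1):     # input applied at step j
                Su[i * nx:(i + 1) * nx, j * nu:(j + 1) * nu] = \
                    np.linalg.matrix_power(A, i - j) @ B
        return Sx, Su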

pyrieef.learning.timeseries.functions.limit_gains(gains, gain_limit)
Parameters
  • gains – [np.array]

  • gain_limit – [float]

Returns

pyrieef.learning.timeseries.functions.mul(X)

Multiply an array of matrices

pyrieef.learning.timeseries.functions.multi_variate_normal(x, mu, sigma=None, log=True, gmm=False, lmbda=None)

Multivariate normal distribution PDF

Parameters
  • x – np.array([nb_samples, nb_dim])

  • mu – np.array([nb_dim])

  • sigma – np.array([nb_dim, nb_dim])

  • log – bool

Returns
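The quantity being evaluated is log N(x; mu, sigma) = -1/2 [(x - mu)^T sigma^{-1} (x - mu) + log|sigma| + nb_dim * log(2*pi)]. A minimal numpy sketch of that computation (illustration only, not this function's exact implementation):

    import numpy as np

    def mvn_logpdf(x, mu, sigma):
        # Log-density of a multivariate normal for x of shape
        # [nb_samples, nb_dim], using a Cholesky factorization for stability.
        d = mu.shape[0]
        dx = x - mu
        L = np.linalg.cholesky(sigma)              # sigma = L @ L.T
        solved = np.linalg.solve(L, dx.T)          # L^{-1} (x - mu)^T
        maha = np.sum(solved ** 2, axis=0)         # squared Mahalanobis distance
        log_det = 2.0 * np.sum(np.log(np.diag(L)))
        return -0.5 * (maha + log_det + d * np.log(2.0 * np.pi))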

pyrieef.learning.timeseries.functions.multi_variate_normal_old(x, mean, covar)

Multi-variate normal distribution

x: [n_data x n_vars] matrix of data points for which to evaluate
mean: [n_vars] vector representing the mean of the distribution
covar: [n_vars x n_vars] matrix representing the covariance of the distribution

pyrieef.learning.timeseries.functions.multi_variate_t(x, nu, mu, sigma=None, log=True, gmm=False, lmbda=None)

Multivariate t-distribution PDF https://en.wikipedia.org/wiki/Multivariate_t-distribution

Parameters
  • x – np.array([nb_samples, nb_dim])

  • mu – np.array([nb_dim])

  • sigma – np.array([nb_dim, nb_dim])

  • log – bool

Returns

pyrieef.learning.timeseries.functions.mvn_pdf(x, mu, sigma_chol, lmbda, sigma=None, reg=None)
Parameters
  • x – np.array([nb_dim x nb_samples]) samples

  • mu – np.array([nb_states x nb_dim]) mean vector

  • sigma_chol – np.array([nb_states x nb_dim x nb_dim]) Cholesky decomposition of the covariance matrices

  • lmbda – np.array([nb_states x nb_dim x nb_dim]) precision matrices

Returns

np.array([nb_states x nb_samples]) log mvn

pyrieef.learning.timeseries.functions.prod_gaussian(mu_1, sigma_1, mu_2, sigma_2)
pyrieef.learning.timeseries.functions.rowvec(x)
pyrieef.learning.timeseries.functions.spline(x, Y, xx, kind='cubic')

Attempts to imitate the MATLAB version of spline
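A minimal scipy-based sketch of the same idea, assuming the standard scipy.interpolate API (an illustration of the call pattern, not this function's implementation):

    import numpy as np
    from scipy.interpolate import interp1d

    x = np.linspace(0.0, 1.0, 10)
    Y = np.sin(2.0 * np.pi * x)
    xx = np.linspace(0.0, 1.0, 100)
    # cubic interpolation of Y sampled at x, evaluated at the query points xx
    yy = interp1d(x, Y, kind='cubic')(xx)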

pyrieef.learning.timeseries.gmm module

pyrieef.learning.timeseries.gmr module

Created on Thu Jul 09 18:43:10 2015

@author: Emmanuel Pignat

class pyrieef.learning.timeseries.gmr.GMR(gmm, use_slice=False, use_pybdlib_format=False)

Bases: object

Gaussian Mixture Regression

get_pdf(i, sample, has_changed=True, reg=1e-09)
get_pdf_un(i, sample, has_changed=True)
predict(sample, input, output, variance_type='full', sigma_input=None)
    if X.ndim == 1:
        X = X[:, np.newaxis]

    if X.shape[0] < self.gmm.n_components:
        raise ValueError(
            'GMM estimation with %s components, but got only %s samples'
            % (self.gmm.n_components, X.shape[0]))

predict_GMM(sample, input, output, variance_type='v', predict=False, norm=False, reg=1e-09)
predict_histogramm(sample, input, output, variance_type='full')
    if X.ndim == 1:
        X = X[:, np.newaxis]

    if X.shape[0] < self.gmm.n_components:
        raise ValueError(
            'GMM estimation with %s components, but got only %s samples'
            % (self.gmm.n_components, X.shape[0]))

predict_local(sample, input, output, variance_type='full', reg=1e-09)

Gaussian mixture regression with a different input for each state

Parameters
  • sample –

    np.array((nb_states,)) or

    np.array((nb_input_dim, nb_states))

    Data observed in the input dimensions for each state

  • input – list, e.g. [0] or [0, 1]. List of input dimensions

  • output – list, e.g. [0] or [0, 1]. List of output dimensions

  • variance_type

Returns
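As a reference for what these predict* methods compute, a hedged sketch of plain Gaussian mixture regression: condition a joint GMM on the input dimensions, then moment-match the conditional components. The names and API below are illustrative and do not match the class exactly:

    import numpy as np
    from scipy.stats import multivariate_normal

    def gmr_predict(priors, mus, sigmas, x_in, in_idx, out_idx):
        # priors: [K], mus: [K, D], sigmas: [K, D, D];
        # x_in is observed on the dimensions in_idx.
        K = len(priors)
        h = np.array([priors[k] * multivariate_normal.pdf(
            x_in, mus[k][in_idx], sigmas[k][np.ix_(in_idx, in_idx)])
            for k in range(K)])
        h /= h.sum()                                   # component responsibilities
        exp_k, cov_k = [], []
        for k in range(K):
            s_ii = sigmas[k][np.ix_(in_idx, in_idx)]
            s_oi = sigmas[k][np.ix_(out_idx, in_idx)]
            gain = s_oi @ np.linalg.inv(s_ii)
            exp_k.append(mus[k][out_idx] + gain @ (x_in - mus[k][in_idx]))
            cov_k.append(sigmas[k][np.ix_(out_idx, out_idx)] - gain @ s_oi.T)
        exp_k = np.array(exp_k)
        mu_out = h @ exp_k                             # moment-matched mean
        sigma_out = sum(h[k] * (cov_k[k] + np.outer(exp_k[k] - mu_out,
                                                    exp_k[k] - mu_out))
                        for k in range(K))
        return mu_out, sigma_out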

pyrieef.learning.timeseries.hmm module

pyrieef.learning.timeseries.model module

pyrieef.learning.timeseries.plot module

pyrieef.learning.timeseries.plot.periodic_clip(val, n_min, n_max)

Keeps val within the range [n_min, n_max) by assuming that val is a periodic value.
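A minimal sketch of the same idea with the modulo operator (assuming a half-open period [n_min, n_max)):

    def periodic_clip_sketch(val, n_min, n_max):
        # Wrap val into [n_min, n_max) by treating it as periodic.
        return n_min + (val - n_min) % (n_max - n_min)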

pyrieef.learning.timeseries.plot.plot_TP(TP, color=[0, 1, 0], alpha=1, scale=0.2)
pyrieef.learning.timeseries.plot.plot_coordinate_system(A, b, scale=1.0, equal=True, ax=None, **kwargs)
Parameters
  • A – nb_dim x nb_dim Rotation matrix

  • b – nb_dim Translation

  • scale – float Scaling of the axis

  • equal – bool Set matplotlib axis to equal

  • ax – plt.axes()

  • kwargs

Returns

pyrieef.learning.timeseries.plot.plot_data(data, dim=[[0, 1]], figsize=(3, 3), fig=None)
pyrieef.learning.timeseries.plot.plot_distpatch(ax, x, mean, var, color=[1, 0, 0], num_std=2, alpha=0.5, linewidth=1, linealpha=1)

Function plots the mean and corresponding variance onto the specified axis

ax : axis object where the distribution patch should be plotted
X : nbpoints array of x-axis values
Mu : nbpoints array of mean values corresponding to the x-axis values
Var : nbpoints array of variance values corresponding to the x-axis values

Author: Martijn Zeestrate, 2015

pyrieef.learning.timeseries.plot.plot_dynamic_system(f, nb_sub=10, ax=None, xlim=[-1, 1], ylim=[-1, 1], scale=0.01, name=None, equal=False, **kwargs)

Plot a dynamical system dx = f(x)

Parameters
  • f – a function that takes x as [N, 2] input and returns dx as [N, 2]

  • nb_sub

  • ax

  • xlim

  • ylim

  • scale

  • kwargs

Returns

pyrieef.learning.timeseries.plot.plot_function_map(f, nb_sub=10, ax=None, xlim=[-1, 1], ylim=[-1, 1], opp=False, exp=False)
Parameters
  • f – [function] A function to plot that can take an array((N, nb_dim)) as input

  • nb_sub

  • ax

  • xlim

  • ylim

Returns

pyrieef.learning.timeseries.plot.plot_gauss3d(ax, mean, covar, n_points=30, n_rings=20, color='red', alpha=0.3, linewidth=0)

Plot 3d Gaussian

pyrieef.learning.timeseries.plot.plot_gaussian(mu, sigma, dim=None, color='r', alpha=0.5, lw=1, markersize=6, ax=None, plots=None, nb_segm=24, **kwargs)
pyrieef.learning.timeseries.plot.plot_gaussian1d(Mu, Sigma, lim=[-1, 1], color=[1, 0, 0], alpha=0.5, lw=1, ax=None, nb=1, nb_step=100)
pyrieef.learning.timeseries.plot.plot_gmm(Mu, Sigma, dim=None, color=[1, 0, 0], alpha=0.5, linewidth=1, markersize=6, ax=None, empty=False, edgecolor=None, edgealpha=None, priors=None, border=False, nb=1, swap=True, center=True)

This function displays the parameters of a Gaussian Mixture Model (GMM).

Inputs
  • Mu: D x K array representing the centers of K Gaussians.

  • Sigma: D x D x K array representing the covariance matrices of K Gaussians.

Author: Martijn Zeestraten, 2015

http://programming-by-demonstration.org/martijnzeestraten

Note (Daniel Berio): switched the matrix layout to be consistent with pbdlib MATLAB; this probably breaks with gmm now.
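For reference, a minimal sketch of the usual way a single 2-D Gaussian patch is drawn (eigendecompose the covariance and map the unit circle through its square root); this is an illustration, not this module's implementation:

    import numpy as np
    import matplotlib.pyplot as plt

    def plot_cov_ellipse(mu, sigma, n_std=1.0, ax=None, **kwargs):
        # Draw the n_std contour of a 2-D Gaussian (mu: length-2 array,
        # sigma: 2 x 2 covariance) as a filled ellipse.
        ax = ax or plt.gca()
        t = np.linspace(0.0, 2.0 * np.pi, 50)
        circle = np.stack([np.cos(t), np.sin(t)])        # 2 x 50 unit circle
        vals, vecs = np.linalg.eigh(sigma)               # sigma = V diag(vals) V^T
        pts = mu[:, None] + n_std * vecs @ np.diag(np.sqrt(vals)) @ circle
        ax.fill(pts[0], pts[1], **kwargs)
        return ax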

pyrieef.learning.timeseries.plot.plot_gmm3d(ax, means, covars, n_points=20, n_rings=15, color='red', alpha=0.4, linewidth=0)

Plot 3D gmm

pyrieef.learning.timeseries.plot.plot_linear_system(K, b=None, name=None, nb_sub=10, ax0=None, xlim=[-1, 1], ylim=[-1, 1], equal=True, scale=0.01, scale_K=100, plot_gains=True, field=None, multi_center=False, **kwargs)
pyrieef.learning.timeseries.plot.plot_mixture_linear_system(model, mode='glob', nb_sub=20, gmm=True, min_alpha=0.0, cmap=<matplotlib.colors.LinearSegmentedColormap object>, A=None, b=None, gmr=False, return_strm=False, **kwargs)
Parameters
  • model

  • mode – in ['glob', 'glob_overlay', 'local']

  • nb_sub

  • min_alpha

  • cmap

  • kwargs

Returns

pyrieef.learning.timeseries.plot.plot_spherical_gmm(Mu, Sigma, dim=None, tp=None, color='r', alpha=255, swap=False)
Parameters
  • Mu

  • Sigma

  • dim

  • tp

  • color – Tuple (R, G, B) 0-255, Tuple (R, G, B, A), np.array((Nx3))

  • alpha

Returns

pyrieef.learning.timeseries.plot.plot_trajdist(td, ix=0, iy=1, covScale=1, color=[1, 0, 0], alpha=0.1, linewidth=0.1)

Plot 2D representation of a trajectory distribution

pyrieef.learning.timeseries.plot.plot_trajreference(meanQ, covarQ, n_vars, q, ax=[], colormap=[])

Plot Reference of GMM-based Trajectory Distributions.

pyrieef.learning.timeseries.plot.plot_y_gaussian(x, mu, sigma, dim=0, alpha=1.0, alpha_fill=None, color='r', lw=1.0, ax=None)
Parameters
  • x – [n_states]

  • mu – [n_states, n_dim]

  • sigma – [n_states, n_dim, n_dim]

  • dim

Returns

pyrieef.learning.timeseries.plot.tri_elipsoid(n_rings, n_points)

Compute the set of triangles that covers a full ellipsoid with n_rings rings and n_points per ring

Module contents