
Multiview Data from Gaussian Mixtures

In this example we show how to simulate multiview data from Gaussian mixtures and plot them using a crossviews plot.

[1]:
from mvlearn.datasets import GaussianMixture
from mvlearn.plotting import crossviews_plot
import numpy as np
import matplotlib.pyplot as plt
%matplotlib inline


Latent variables are sampled from two multivariate Gaussians with equal prior probability. A polynomial transformation is then applied, and noise is added independently to both the transformed and untransformed latents, yielding the two views.

[2]:
n_samples = 100
centers = [[0, 1], [0, -1]]
covariances = [np.eye(2), np.eye(2)]

# Define the latent Gaussian mixture and shuffle the samples
GM = GaussianMixture(n_samples, centers, covariances, random_state=42,
                     shuffle=True, shuffle_random_state=42)

# Generate the two views: a polynomial transform plus 2 appended noise dimensions
GM = GM.sample_views(transform='poly', n_noise=2)

latent, y = GM.get_Xy(latents=True)   # latent variables and class labels
Xs, _ = GM.get_Xy(latents=False)      # list containing the two sampled views
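
For intuition, the generative process above can be sketched directly with NumPy, reusing n_samples, centers, and covariances from the cell above. This is only a conceptual reproduction, not mvlearn's exact implementation: the elementwise cubic map standing in for the 'poly' transform and the way the noise dimensions are appended are assumptions made for illustration.

rng = np.random.default_rng(42)

# Draw class labels with equal prior probability, then the 2D latents
labels = rng.integers(0, 2, size=n_samples)
latents = np.vstack([
    rng.multivariate_normal(centers[k], covariances[k]) for k in labels
])

# View 1: the untransformed latents; view 2: an elementwise cubic map
# (a stand-in for the 'poly' transform -- the exact polynomial is an assumption)
view1 = latents
view2 = latents ** 3

# Append n_noise independent Gaussian noise dimensions to each view
n_noise = 2
Xs_sketch = [np.hstack([v, rng.standard_normal((n_samples, n_noise))])
             for v in (view1, view2)]

print([X.shape for X in Xs_sketch])  # [(100, 4), (100, 4)]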

The latent data is plotted against itself to reveal the underlying distribution.

[3]:
crossviews_plot([latent, latent], labels=y, title='Latent Variable', equal_axes=True)
../../_images/tutorials_datasets_GaussianMixtures_5_0.svg
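
crossviews_plot draws every pairwise panel for us. If you prefer plain matplotlib, a minimal single-panel sketch of the same latent space, reusing latent and y from the cells above with purely illustrative styling, would look like this:

fig, ax = plt.subplots(figsize=(5, 5))
for label in np.unique(y):
    mask = y == label
    ax.scatter(latent[mask, 0], latent[mask, 1], s=10, label=f'class {int(label)}')
ax.set_xlabel('latent dimension 1')
ax.set_ylabel('latent dimension 2')
ax.set_aspect('equal')  # mirrors equal_axes=True
ax.legend()
ax.set_title('Latent Variable')
plt.show()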

The noisy latent variable (view 1) is plotted against the noisy, polynomially transformed latent variable (view 2), an example of a dataset with two views.

[4]:
crossviews_plot(Xs, labels=y, title='View 1 vs. View 2 (Polynomial Transform + noise)', equal_axes=True)
../../_images/tutorials_datasets_GaussianMixtures_7_0.svg
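
Xs is simply a list with one array per view, sharing the same sample ordering, which is the format mvlearn's multiview estimators consume. A quick sanity check of that structure (the printed shapes depend on the settings chosen above) might look like:

# One array per view; both views describe the same samples in the same order
for i, X in enumerate(Xs):
    print(f'View {i + 1} shape: {X.shape}')
print('Labels shape:', y.shape)

assert all(X.shape[0] == len(y) for X in Xs)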