Linlin's blog

Remote JupyterLab without SSH and sudo

Disclaimer: this guide is only suggested for servers on secure local networks, e.g. within an institution's or corporation's network.

SSH port forwarding is a common way of connecting to remote Jupyter notebooks. It typically takes three steps: run Jupyter on the server, open an SSH tunnel to the Jupyter instance, and then paste the localhost link into your browser. That isn't very satisfying, and it could be simpler. In this post, I'll guide you through setting up a remote JupyterLab workspace for Python 3 from scratch. Since you want to set up remote notebooks, I'll assume you are comfortable with the command line and remote editing.
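For reference, the traditional three-step workflow looks roughly like this; the hostname, username, and port below are placeholders, not values from this post:

```shell
# 1. On the server: start JupyterLab without opening a browser
jupyter lab --no-browser --port=8888

# 2. On your laptop: forward local port 8888 to the server's port 8888
#    ("user" and "server.example.com" are placeholders)
ssh -N -L 8888:localhost:8888 user@server.example.com

# 3. In your browser, open the tokenized link Jupyter printed, e.g.
#    http://localhost:8888/?token=...
```

Every one of these steps must be repeated whenever the tunnel drops, which is part of why a tunnel-free setup is attractive.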

Check your Python version on the server

As of 2020, Python3 is strongly recommended.

If you are using Red Hat Enterprise Linux 7, the system-wide default is Python 2.7, but your system administrator usually will have installed Python 3. If, say, Python 3.6 is installed via Software Collections, you can enable it with scl enable rh-python36 bash.

If you are using Debian or Ubuntu, Python 3 comes with the system. In case you want to make python3 the default, add the following line to your .bashrc file: alias python=python3
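A quick way to check what is available and apply the alias; the versions shown in the comments are illustrative, and this only affects interactive shells that read .bashrc:

```shell
# See which interpreters are available and their versions
python --version || true     # may be Python 2.x, or missing entirely
python3 --version            # e.g. Python 3.6.x

# Make `python` point at python3 in future shell sessions
echo 'alias python=python3' >> ~/.bashrc
source ~/.bashrc
python --version             # should now report Python 3.x
```

Note that aliases are not seen by scripts or by programs that invoke `python` directly, so this is a convenience for interactive use only.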

Manage Python environments

Working with Python virtual environments is...
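As a sketch of where that post is headed, creating an isolated environment and installing JupyterLab into it might look like the following; the environment path is an arbitrary choice for illustration:

```shell
# Create and activate a virtual environment (no sudo required)
python3 -m venv ~/envs/lab
source ~/envs/lab/bin/activate

# Install JupyterLab inside it, isolated from the system Python
pip install jupyterlab
jupyter lab --version
```

Everything installed this way lives under your home directory, which fits the "without sudo" goal of the post above.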

Bayesian basics II - Inference for a univariate Gaussian, Maximum a Posteriori vs. Maximum Likelihood

In an earlier post, we got to know the concept of Bayesian reasoning. In this post we show the Bayesian way of inferring basic statistics and briefly compare Maximum a Posteriori to Maximum Likelihood estimation.

As a simple example of Bayesian inference in action, we estimate the mean \(\mu\) of a univariate Gaussian with known variance \(\sigma^2\). Given \(N\) observations \(X=(x_1,\cdots, x_N)\), the maximum likelihood estimate is \(\mu_{ML}=\sum_k x_k/N=\bar X\); we omit the calculation details. Now we focus on the Bayesian estimate.
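A minimal numeric sketch of the comparison, using the standard conjugate-prior result for a Gaussian likelihood with known variance; the prior hyperparameters below are assumptions chosen purely for illustration:

```python
import random
import statistics

random.seed(0)
sigma = 2.0       # known standard deviation
mu_true = 1.5     # used only to simulate data
x = [random.gauss(mu_true, sigma) for _ in range(100)]

# Maximum likelihood estimate of mu: the sample mean x-bar
mu_ml = statistics.fmean(x)

# Bayesian estimate with a conjugate Gaussian prior N(mu0, sigma0^2);
# these hyperparameters are illustrative assumptions.
mu0, sigma0 = 0.0, 1.0
N = len(x)

# Posterior mean for a Gaussian likelihood with known variance:
# mu_N = (sigma^2 * mu0 + N * sigma0^2 * x_bar) / (N * sigma0^2 + sigma^2)
mu_post = (sigma**2 * mu0 + N * sigma0**2 * mu_ml) / (N * sigma0**2 + sigma**2)
```

The posterior mean is a weighted average of the prior mean and the ML estimate, so it always lies between them and approaches \(\bar X\) as \(N\) grows.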

Why can cross entropy be a loss function?

This post is an expansion of my answer on Cross Validated.

Intuitively we could take the Kullback–Leibler (KL) divergence, which quantifies the distance between two distributions, as the error function, so why does cross entropy arise for classification problems? To answer this, let us first recall that entropy measures the uncertainty of a system and is defined as