ss

ss ss is a wonderful tool to monitor/view connections on and to your machine (server). Just running ss shows all connections. You may filter it by socket type (tcp, udp, unix, raw etc.) and by each connection's state. To show current listening sockets use $ ss -l. If you use ss with watch you'll get a real-time updated list of connections. For example, here we show only IPv4 tcp-connections (the -t4 switch) that are currently connected (state connected).
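A small sketch of that (flags and the state filter as in the ss and watch man pages; the one-second refresh interval is just a choice):

$ ss -l                                  # all listening sockets
$ ss -t4 state connected                 # IPv4 TCP sockets that are currently connected
$ watch -n 1 'ss -t4 state connected'    # the same list, refreshed every second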

pipenv

pipenv is a tool like npm: it creates an isolated virtual environment per project. Pipenv aims to bring the best of all packaging worlds (bundler, composer, npm, cargo, yarn, etc.) to the Python world. Windows is a first-class citizen, in our world. It automatically creates and manages a virtualenv for your projects, as well as adds/removes packages from your Pipfile as you install/uninstall packages.
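A minimal sketch of the workflow (the project directory myproject and the requests package are placeholders, not from the post):

$ pip install --user pipenv     # install pipenv itself
$ cd myproject                  # any project directory
$ pipenv install requests       # creates a Pipfile and a virtualenv, then installs requests
$ pipenv shell                  # spawn a shell inside that virtualenv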

ssh

ssh connect $ ssh userNameOnServer@<IP_ADDRESS_OF_SERVER> connects via ssh to the server on port 22. If the port is not 22, add the port number with the -p argument; $ ssh userNameOnServer@<IP_ADDRESS_OF_SERVER> -p <PORT_NBR> generate ssh key Generate your key; $ ssh-keygen -t rsa Follow the prompts - choose the default location where the keys will be saved. A ~/.ssh directory has been created with your keys in it. The .pub file is your public key and the other file is your private key.
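The full round trip as a sketch (ssh-copy-id is an addition here, not mentioned in the post, and port 2222 is only an example):

$ ssh-keygen -t rsa                                      # generate the key pair, accept the default ~/.ssh location
$ ssh-copy-id userNameOnServer@<IP_ADDRESS_OF_SERVER>    # install your public key on the server
$ ssh userNameOnServer@<IP_ADDRESS_OF_SERVER> -p 2222    # connect on a non-default port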

Spark cluster how-to

This is how to get a Spark cluster running, with one master and one slave (worker). I used a droplet on Digital Ocean (master) and one local Vagrant machine (slave). spark in vagrant References: thank you Bin, kulwant singh, austin ouyang Installation on all nodes (master and slaves) For the slave (worker) I used a Vagrant machine as in my previous post. For the master I used a droplet on Digital Ocean.
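The core of the setup, sketched with the Spark 2.x standalone scripts (the SPARK_HOME path and the <MASTER_PUBLIC_IP> placeholder are assumptions, not taken from the post):

$ $SPARK_HOME/sbin/start-master.sh --host <MASTER_PUBLIC_IP>        # on the master (the droplet)
$ $SPARK_HOME/sbin/start-slave.sh spark://<MASTER_PUBLIC_IP>:7077   # on the slave (the vagrant machine), pointed at the master URL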

Install jupyter with spark

For the Spark installation in a vagrant instance, see the previous post. Prerequisites You should have a working Spark installation inside a vagrant machine (Ubuntu Xenial 16.04). See the previous post for doing that. Provision vagrant - expose ports In the folder where your vagrant machine is located, edit your Vagrantfile. We will expose the ports for Jupyter and Spark to the host (the physical machine that runs vagrant). We will expose 2 ports - 8888 for the Jupyter notebooks and 7077 for the Spark master, roughly as in the snippet below.
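The Vagrantfile lines for that would look roughly like this (mapping guest ports to the same host ports is an assumption; adjust to taste):

config.vm.network "forwarded_port", guest: 8888, host: 8888   # Jupyter notebooks
config.vm.network "forwarded_port", guest: 7077, host: 7077   # Spark master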

Spark installation in vagrant

This guide shows how to install Spark, single node, in a Vagrant virtual machine. Reference: thank you Bin Ubuntu Xenial This was done inside a vagrant machine; Ubuntu Xenial (16.04). $ vagrant init envimation/ubuntu-xenial $ vagrant up $ vagrant ssh and you're in. Install openjdk $ sudo apt-get update && sudo apt-get install -y openjdk-8-jdk && ll /usr/lib/jvm/ && ll /usr/bin/java && ll /etc/alternatives/java && sudo update-alternatives --config java
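A sketch of the Spark download step that typically follows (the 2.2.0/Hadoop 2.7 version and the /opt/spark location are assumptions, not taken from the post):

$ wget https://archive.apache.org/dist/spark/spark-2.2.0/spark-2.2.0-bin-hadoop2.7.tgz
$ tar xzf spark-2.2.0-bin-hadoop2.7.tgz
$ sudo mv spark-2.2.0-bin-hadoop2.7 /opt/spark
$ echo 'export SPARK_HOME=/opt/spark' >> ~/.bashrc
$ echo 'export PATH=$PATH:$SPARK_HOME/bin:$SPARK_HOME/sbin' >> ~/.bashrc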

Node Passport Login

nodejs with passport Creating a web application often involves some sort of login and user management. This is a short example of using nodejs with passport's built-in strategies for local (with mongodb) and google OAuth2. Passport, http://www.passportjs.org/, has strategies for almost every provider there is. Very handy. I won't list all the source code; you'll find that in the Github repo. Instead, some highlights are shown below. Here is the Github repo with the complete source code.
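To get the dependencies in place, something like this (the exact package list is an assumption based on the local and Google OAuth2 strategies mentioned above; check the repo for what the post actually uses):

$ npm init -y
$ npm install express express-session mongoose passport passport-local passport-google-oauth20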

tmux

Isn’t tmux just wonderful? Start sessions with processes and then attach to them at a later time. See this guide; A Quick and Easy Guide to tmux Memo; $ tmux #start new session $ tmux ls #list available sessions $ tmux attach -t 0 #attach to session 0, listed from tmux ls $ tmux kill-session -t 0 #kill session 0 Key bindings; C-b % #split pane vertically C-b "
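A small named-session example (the session name spark is just a placeholder):

$ tmux new -s spark      # start a new named session, run your long-lived processes in it, detach with C-b d
$ tmux attach -t spark   # re-attach to it later, even from a new ssh login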