jonathan jonaqp
@jonaqp
jonaqp / h2o_flow.ipynb
Created April 3, 2019 15:26 — forked from ahmedengu/h2o_flow.ipynb
Installing and running H2O Flow on Google Colab and accessing it via ngrok. Don't forget to set the ngrok auth key.
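A minimal sketch of the workflow described above, assuming the h2o and pyngrok Python packages rather than the notebook's actual contents.

# Sketch only: start a local H2O instance and expose the Flow UI through ngrok.
# The port and both package choices are assumptions, not taken from the notebook.
import h2o
from pyngrok import ngrok

ngrok.set_auth_token("YOUR_NGROK_AUTH_TOKEN")  # placeholder for your own key
h2o.init(port=54321)                           # H2O Flow is served on this port
public_url = ngrok.connect(54321)              # public tunnel to the Flow UI
print("H2O Flow available at:", public_url)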
@jonaqp
jonaqp / Boleta.xml
Created March 4, 2019 06:33 — forked from giansalex/Boleta.xml
SUNAT UBL 2.1 electronic invoicing documents (Factura, Boleta, Nota de Crédito, Nota de Débito), generated with Greenter https://giansalex.github.io/greenter
<?xml version="1.0" encoding="UTF-8"?>
<Invoice xmlns="urn:oasis:names:specification:ubl:schema:xsd:Invoice-2" xmlns:cac="urn:oasis:names:specification:ubl:schema:xsd:CommonAggregateComponents-2" xmlns:cbc="urn:oasis:names:specification:ubl:schema:xsd:CommonBasicComponents-2" xmlns:ds="http://www.w3.org/2000/09/xmldsig#" xmlns:ext="urn:oasis:names:specification:ubl:schema:xsd:CommonExtensionComponents-2">
<ext:UBLExtensions>
<ext:UBLExtension />
</ext:UBLExtensions>
<cbc:UBLVersionID>2.1</cbc:UBLVersionID>
<cbc:CustomizationID>2.0</cbc:CustomizationID>
<cbc:ID>B001-1</cbc:ID>
<cbc:IssueDate>2018-10-15</cbc:IssueDate>
<cbc:IssueTime>00:44:53</cbc:IssueTime>
@jonaqp
jonaqp / data_loader.py
Created February 8, 2019 12:04 — forked from kevinzakka/data_loader.py
Train, Validation and Test Split for torchvision Datasets
"""
Create train, valid, test iterators for CIFAR-10 [1].
Easily extended to MNIST, CIFAR-100, and ImageNet.
[1]: https://discuss.pytorch.org/t/feedback-on-pytorch-for-kaggle-competitions/2252/4
"""
import torch
import numpy as np
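Continuing from the imports above, a minimal sketch of the technique (not the gist's exact function signatures): carve a validation split out of the CIFAR-10 training set with SubsetRandomSampler and give each subset its own DataLoader.

# Sketch only: valid_size, batch_size and the data root are assumptions.
from torch.utils.data import DataLoader, SubsetRandomSampler
from torchvision import datasets, transforms

valid_size = 0.1
train_set = datasets.CIFAR10(root="data", train=True, download=True,
                             transform=transforms.ToTensor())
indices = np.arange(len(train_set))
np.random.shuffle(indices)
split = int(valid_size * len(train_set))
valid_idx, train_idx = indices[:split], indices[split:]

train_loader = DataLoader(train_set, batch_size=64,
                          sampler=SubsetRandomSampler(train_idx))
valid_loader = DataLoader(train_set, batch_size=64,
                          sampler=SubsetRandomSampler(valid_idx))
test_set = datasets.CIFAR10(root="data", train=False, download=True,
                            transform=transforms.ToTensor())
test_loader = DataLoader(test_set, batch_size=64, shuffle=False)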
#!/bin/bash
# License: Public Domain.
echo "export PYSPARK_PYTHON=python3" | tee -a /etc/profile.d/spark_config.sh /etc/*bashrc /usr/lib/spark/conf/spark-env.sh
echo "export PYTHONHASHSEED=0" | tee -a /etc/profile.d/spark_config.sh /etc/*bashrc /usr/lib/spark/conf/spark-env.sh
echo "spark.executorEnv.PYTHONHASHSEED=0" >> /etc/spark/conf/spark-defaults.conf
# Only run on the master node
ROLE=$(/usr/share/google/get_metadata_value attributes/dataproc-role)
if [[ "${ROLE}" == 'Master' ]]; then
@jonaqp
jonaqp / install-py3-dataproc.sh
Created May 30, 2018 11:43 — forked from cerisier/install-py3-dataproc.sh
Dataproc initialization action script for installing python3
#!/bin/bash
# from https://gist.github.com/nehalecky/9258c01fb2077f51545a/raw/789f08141dc681cf1ad5da05455c2cd01d1649e8/install-py3-dataproc.sh
apt-get -y install python3
echo "export PYSPARK_PYTHON=python3" | tee -a /etc/profile.d/spark_config.sh /etc/*bashrc /usr/lib/spark/conf/spark-env.sh
echo "Adding PYTHONHASHSEED=0 to profiles and spark-defaults.conf..."
echo "export PYTHONHASHSEED=0" | tee -a /etc/profile.d/spark_config.sh /etc/*bashrc /usr/lib/spark/conf/spark-env.sh
echo "spark.executorEnv.PYTHONHASHSEED=0" >> /etc/spark/conf/spark-defaults.conf
@jonaqp
jonaqp / r_ubuntu_17_10.sh
Created May 7, 2018 21:01 — forked from pachadotdev/r_ubuntu_17_10.sh
Install R on Ubuntu 17.10
# Install R
sudo apt-get update
sudo apt-get install gdebi libxml2-dev libssl-dev libcurl4-openssl-dev libopenblas-dev r-base r-base-dev
# Install RStudio
cd ~/Downloads
wget https://download1.rstudio.org/rstudio-xenial-1.1.383-amd64.deb
sudo gdebi rstudio-xenial-1.1.383-amd64.deb
printf '\nexport QT_STYLE_OVERRIDE=gtk\n' | sudo tee -a ~/.profile
@jonaqp
jonaqp / console output
Created April 26, 2018 07:15 — forked from mitallast/console output
Example Naive Bayes Classifier with Apache Spark Pipeline
+--------+--------------------+-----+--------------------+--------------------+--------------------+--------------------+----------+
|category| text|label| words| features| rawPrediction| probability|prediction|
+--------+--------------------+-----+--------------------+--------------------+--------------------+--------------------+----------+
| 3001|Плойки и наборы V...| 24.0|[плойки, и, набор...|(10000,[326,796,1...|[-174.67716870697...|[6.63481663197049...| 24.0|
| 833|"Чехол-обложка дл...| 1.0|["чехол-обложка, ...|(10000,[514,986,1...|[-379.37151502387...|[5.32678001676623...| 1.0|
| 833|"Чехол-обложка дл...| 1.0|["чехол-обложка, ...|(10000,[514,986,1...|[-379.84825219376...|[2.15785456821554...| 1.0|
| 833|"Чехол-обложка дл...| 1.0|["чехол-обложка, ...|(10000,[290,514,9...|[-395.42735009477...|[6.44323423370500...| 1.0|
| 833|"Чехол-обложка дл...| 1.0|["чехол-обложка, ...|(10000,[290,514,9...|[-396.10251348

How to add a Python virtualenv to Jupyter Notebook

Install jupyter

$pip install -U jupyter

Create virtual environment
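The preview ends here; a typical continuation of this recipe (the standard virtualenv + ipykernel commands, not necessarily the exact ones from the original gist; my-env is a placeholder name) looks like this:

$pip install virtualenv
$virtualenv my-env
$source my-env/bin/activate
$pip install ipykernel
$python -m ipykernel install --user --name=my-env

After that, my-env shows up as a selectable kernel when creating a new notebook.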

@jonaqp
jonaqp / compressMe.py
Created November 13, 2017 22:24 — forked from ShantanuJoshi/compressMe.py
Python Image Compress
# Run this in any directory; add -v for verbose output
# Get Pillow (a fork of PIL) from pip before running: pip install Pillow
import os
import sys
from PIL import Image
def compressMe(file, verbose=False):
filepath = os.path.join(os.getcwd(), file)
oldsize = os.stat(filepath).st_size
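The function body is cut off in the preview; a hypothetical driver based on the signature shown above (compressMe(file, verbose=False), with file resolved relative to the current directory):

# Hypothetical usage only: compress every JPEG in the current directory,
# honouring the -v flag mentioned in the comment at the top of the file.
if __name__ == "__main__":
    verbose = "-v" in sys.argv
    for name in os.listdir(os.getcwd()):
        if name.lower().endswith((".jpg", ".jpeg")):
            compressMe(name, verbose=verbose)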
@jonaqp
jonaqp / model.py
Created September 19, 2017 22:54 — forked from BideoWego/model.py
Simple Python Model class for use with SQLite3
import sqlite3
class Model:
def __init__(self, db, table):
self.db = db
self.table = table
self.connection = sqlite3.connect(db + '.db')
self.connection.row_factory = sqlite3.Row
def create(self, row):
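The preview stops at create(); a hypothetical usage of the constructor shown above (database and table names are placeholders):

# Hypothetical usage: connects to example.db and points the instance at a table.
# Only construction is illustrated, since create() is truncated in this preview.
m = Model("example", "users")
print(m.table)        # "users"
print(m.connection)   # open sqlite3 connection with Row as row_factory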