- Developing a custom StreamSourceProvider (a sketch follows this list)
- Migrating TextSocketStream to SparkSession (currently uses SQLContext)
- Developing Sink and Source for Apache Kafka
- JDBC support (with PostgreSQL as the database)
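As a starting point for the first item, here is a minimal sketch of the shape of a custom `StreamSourceProvider` and its `Source`, against the Spark 2.x API in `org.apache.spark.sql.sources` and the internal `org.apache.spark.sql.execution.streaming` package. `DemoSourceProvider`, `DemoSource` and the single static batch are illustrative assumptions, not existing code.

```scala
import org.apache.spark.sql.{DataFrame, Row, SQLContext}
import org.apache.spark.sql.execution.streaming.{LongOffset, Offset, Source}
import org.apache.spark.sql.sources.{DataSourceRegister, StreamSourceProvider}
import org.apache.spark.sql.types.{StringType, StructField, StructType}

// Illustrative names: DemoSourceProvider / DemoSource do not exist anywhere yet.
class DemoSourceProvider extends StreamSourceProvider with DataSourceRegister {

  override def shortName(): String = "demo"

  private val defaultSchema = StructType(StructField("value", StringType) :: Nil)

  override def sourceSchema(
      sqlContext: SQLContext,
      schema: Option[StructType],
      providerName: String,
      parameters: Map[String, String]): (String, StructType) =
    (shortName(), schema.getOrElse(defaultSchema))

  override def createSource(
      sqlContext: SQLContext,
      metadataPath: String,
      schema: Option[StructType],
      providerName: String,
      parameters: Map[String, String]): Source =
    new DemoSource(sqlContext, schema.getOrElse(defaultSchema))
}

class DemoSource(sqlContext: SQLContext, override val schema: StructType) extends Source {

  // A single, static micro-batch, just to show the shape of the API.
  override def getOffset: Option[Offset] = Some(LongOffset(0L))

  override def getBatch(start: Option[Offset], end: Offset): DataFrame = {
    // Caveat: depending on the Spark version, the engine may expect this DataFrame to be
    // built through internal helpers so that it is flagged as streaming; the built-in
    // TextSocketSource lives inside the org.apache.spark.sql package for that reason.
    val rows = sqlContext.sparkContext.parallelize(Seq(Row("hello"), Row("streaming")))
    sqlContext.createDataFrame(rows, schema)
  }

  override def stop(): Unit = ()
}
```

With a `META-INF/services/org.apache.spark.sql.sources.DataSourceRegister` entry the source could then be addressed as `spark.readStream.format("demo").load()`; without it, the fully qualified provider class name can be passed to `format(...)` instead.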
- no upfront installation or agents on remote/slave machines; SSH access should be enough
- application components should be able to use third-party software deployed separately, e.g. HDFS or a Spark cluster
- configuration templating
- environment requires/asserts, e.g. a JVM of a given version must be present before the deployment proceeds (a sketch follows this list)
- deployment process run from Jenkins
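For the requires/asserts item, a minimal sketch of such a check, assuming it can be phrased as a predicate evaluated against the JVM of the target machine. Here it only inspects the JVM it runs on; in the real tool the same check would be executed over SSH on the remote host. `EnvironmentAsserts` and `requireJavaAtLeast` are made-up names for illustration.

```scala
// Hypothetical environment assertion: fail fast before any artifacts are copied.
object EnvironmentAsserts {

  /** Parses strings like "1.8.0_292" or "11.0.2" into a major version number. */
  def majorJavaVersion(raw: String): Int = {
    val parts = raw.split("\\.")
    if (parts(0) == "1") parts(1).toInt else parts(0).toInt
  }

  def requireJavaAtLeast(requiredMajor: Int): Unit = {
    val raw = sys.props("java.version")
    require(
      majorJavaVersion(raw) >= requiredMajor,
      s"JVM $requiredMajor+ required, found $raw")
  }

  def main(args: Array[String]): Unit =
    requireJavaAtLeast(8) // abort the deployment if the requirement is not met
}
```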
(by @andrestaltz)
If you prefer to watch video tutorials with live coding, check out this series I recorded with the same content as this article: Egghead.io - Introduction to Reactive Programming.
```scala
/**
 * Part Zero : 10:15 Saturday Night
 *
 * (In which we will see how to let the type system help you handle failure)...
 *
 * First let's define a domain. (All the following requires scala 2.9.x and scalaz 6.0)
 */
import scalaz._
import Scalaz._
```
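To make the "let the type system help you handle failure" idea concrete, here is a small example of scalaz 6.0's `Validation`, assuming the imports above; `Person` and `checkAge` are illustrative stand-ins, not the gist's actual domain.

```scala
// A failure is a value, not an exception, so callers must handle both cases.
case class Person(name: String, age: Int)

def checkAge(p: Person): Validation[String, Person] =
  if (p.age < 18) "Too young!".fail else p.success

checkAge(Person("Alice", 17)) // Failure("Too young!")
checkAge(Person("Bob", 30))   // Success(Person(Bob,30))
```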
Copyright © 2017 Fantasyland Institute of Learning. All rights reserved.
A function is a mapping from one set, called a domain, to another set, called the codomain. A function associates every element in the domain with exactly one element in the codomain. In Scala, both domain and codomain are types.
```scala
val square : Int => Int = x => x * x
```
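Applying `square` illustrates the "exactly one" part of the definition: every `Int` in the domain is sent to a single `Int` in the codomain, although different inputs may share an output.

```scala
square(3)  // 9
square(-3) // 9 (distinct domain elements may map to the same codomain element)
```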