We mainly offer four-hour workshops that will get you started with Hadoop, Spark, Cassandra and Machine Learning.

Available workshops

Intro to Spark

The main focus is understanding Spark's core concepts (RDDs, transformations and actions) and seeing how easily you can deploy Spark and keep learning it at home.
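To give a flavour of what you will practise, here is a minimal sketch of transformations and actions on an RDD; the object name, local master setting and the numbers used are purely illustrative, assuming a local Spark installation in Scala:

    import org.apache.spark.{SparkConf, SparkContext}

    object RddIntro {
      def main(args: Array[String]): Unit = {
        // Illustrative local setup; in the workshop you will use your own environment
        val conf = new SparkConf().setAppName("RddIntro").setMaster("local[*]")
        val sc = new SparkContext(conf)

        val numbers = sc.parallelize(1 to 100)      // create an RDD from a local collection
        val squares = numbers.map(n => n * n)       // transformation: declared lazily
        val evens   = squares.filter(_ % 2 == 0)    // another transformation, still lazy
        println(evens.count())                      // action: triggers the actual computation

        sc.stop()
      }
    }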


You need to bring your own laptop and have basic knowledge of your chosen programming language (Java, Python or Scala). All the details for installing Spark and setting up the work environment will be provided at least one week before the workshop.

Machine Learning & Spark

You will learn about data pre-processing, clustering and classification algorithms, GradientBoostedTrees and topic analysis.
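As a small taste of the GradientBoostedTrees part, the sketch below trains a boosted-trees classifier with Spark MLlib; the dataset path, split ratios and parameter values are only assumptions for illustration, not workshop material:

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.mllib.tree.GradientBoostedTrees
    import org.apache.spark.mllib.tree.configuration.BoostingStrategy
    import org.apache.spark.mllib.util.MLUtils

    object GbtSketch {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("GbtSketch").setMaster("local[*]"))

        // Load a labelled dataset in LIBSVM format (path is illustrative) and split it
        val data = MLUtils.loadLibSVMFile(sc, "data/sample_libsvm_data.txt")
        val Array(training, test) = data.randomSplit(Array(0.7, 0.3))

        // Configure boosting for classification; values chosen only as an example
        val boostingStrategy = BoostingStrategy.defaultParams("Classification")
        boostingStrategy.numIterations = 10
        boostingStrategy.treeStrategy.maxDepth = 5

        val model = GradientBoostedTrees.train(training, boostingStrategy)

        // Evaluate: fraction of misclassified test points
        val testErr = test.map(p => (model.predict(p.features), p.label))
                          .filter { case (pred, label) => pred != label }
                          .count().toDouble / test.count()
        println(s"Test error = $testErr")

        sc.stop()
      }
    }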


Knowledge of Java or Scala programming is a must, as well as Spark basics (RDDs, map/reduce operators) and Maven.

Hadoop MapReduce and Spark training


Start your own Hadoop or Spark cluster, pump data into it and start the analysis.
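For example, once your cluster is up, a classic word count is often the first job you will submit; the HDFS paths below are illustrative only:

    import org.apache.spark.{SparkConf, SparkContext}

    object WordCount {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("WordCount"))

        // Read text from the cluster, count word occurrences and write the result back
        sc.textFile("hdfs:///data/input.txt")
          .flatMap(_.split("\\s+"))
          .map(word => (word, 1))
          .reduceByKey(_ + _)
          .saveAsTextFile("hdfs:///data/wordcount-output")

        sc.stop()
      }
    }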

Future workshops (from May 2016 onwards)

      • Intro to Big Data Workshop
      • Big Data for the structured minds (Spark SQL)
      • An end-to-end Big Data case
      • Python for Big Data

Training courses (only on request)

For technology training courses, please contact us.

Contact us now