Big Data Bootcamp
Austin Convention Center
Global Big Data Conference is offering a 3-day intensive bootcamp on Big Data.
Date: August 28th, 29th & 30th
Venue: Austin Convention Center
Who Should Attend
Engineers
Developers
Networking specialists
Managers
Executives
Students
Professional Services
Architects
Data Analysts
BI Developer/Architect
QA
Performance Engineers
PM
Click the URL below to register
http://globalbigdataconference.com/55/austin/big-data-bootcamp/attendee-registration.html
Agenda: Big Data Track
Day 1: Aug 28th (7:30 AM – 8:00 PM)
• Registration
• Introduction to Big Data & Spark
• RDD, DAG, Spark Core
• Scala Workshop
• A Primer on Scaling NoSQL Technologies with an Emphasis on MongoDB
• Spark SQL Workshop
• Spark Streaming Workshop
• Machine Learning with Spark - MLlib Workshop
• Spark GraphX Workshop - Graph Data Processing
• BlinkDB and Tachyon
• Spark Wrap-up and Q&A
• Use Case: Spark & Spark Streaming
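For readers new to the Day 1 material: Spark's RDD model chains lazy functional transformations (map, filter, reduce) over distributed collections. A minimal pure-Python sketch of the same pattern, with illustrative sample data that is not from the course:

```python
# Pure-Python analogue of a Spark RDD transformation chain.
# In real Spark this would be sc.parallelize(lines).flatMap(...).filter(...);
# the sample data below is made up for illustration.
from functools import reduce

lines = ["spark streaming", "spark sql", "mllib"]

# flatMap-style step: split each line into words
words = [w for line in lines for w in line.split()]

# filter-style step: keep only the word "spark"
spark_words = [w for w in words if w == "spark"]

# reduce-style step: count the surviving elements
count = reduce(lambda acc, _: acc + 1, spark_words, 0)
print(count)  # 2
```

The key idea carried over to Spark is that each step is a pure function over the previous collection, which is what lets Spark distribute and recompute the chain.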
Day 2: Aug 29th (7:45 AM – 8:00 PM)
• Introduction to NoSQL Databases for Developers & Polyglot Persistence
• HBase Workshop
• Career in Analytics: An Analytical Approach to “tasting before eating” from the Analytics Buffet
• Keynote: New Features in Cassandra
• Python Data Analytics Stack
• Cassandra Workshop
• Neo4j Workshop
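As a small taste of the Python data analytics session: the workshop presumably uses NumPy/pandas, but the basic descriptive-statistics workflow can be sketched with only the standard library and made-up sample data:

```python
# Descriptive statistics with the stdlib only; latencies_ms is
# hypothetical sample data, not from the course.
import statistics

latencies_ms = [12.0, 15.5, 11.2, 13.8, 14.1]

mean = statistics.mean(latencies_ms)
median = statistics.median(latencies_ms)
stdev = statistics.stdev(latencies_ms)  # sample standard deviation

print(f"mean={mean:.2f} median={median:.2f} stdev={stdev:.2f}")
```

The same three calls map directly onto `DataFrame.mean()`, `.median()`, and `.std()` once the data lives in pandas.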
Day 3: Aug 30th (8:00 AM – 8:00 PM)
• Introduction to Hadoop
• Hadoop Architecture
• Setting Up the Development Environment
• Understanding the Distributed File System (HDFS)
• Understanding the Distributed Processing Framework (MapReduce and non-MapReduce)
• Overview of the Hadoop Ecosystem (HDFS, MapReduce, Hive/Pig, Sqoop, HBase, Spark, Impala, etc.)
• Fast Time-Series Analytics with HBase and R
• Selecting the Right Big Data Tool for the Right Job, & Making It Work for You
• Hands-on Workshop to Convert Data into Analytics
• Introduction to Data Ingestion - Tips and Techniques
• Creating a Dashboard Against Data Processed Using the Hadoop Ecosystem
• Tools Used - HDFS, Hive, Hue, Tableau Public
• Define Your Strategy to Implement Hadoop, Build a Use Case, and Maximize Return
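The MapReduce model at the heart of the Day 3 track can be shown in miniature. This is a single-process sketch of the classic word-count job; on a real Hadoop cluster the map and reduce phases run in parallel over HDFS blocks, but the dataflow is the same. The input documents are invented for illustration:

```python
# Single-process sketch of a MapReduce word count.
from collections import defaultdict

docs = ["big data", "big hadoop", "data data"]  # stand-in for HDFS input splits

# Map phase: emit a (word, 1) pair for every word in every document.
mapped = [(word, 1) for doc in docs for word in doc.split()]

# Shuffle phase: group the emitted values by key (word).
groups = defaultdict(list)
for word, one in mapped:
    groups[word].append(one)

# Reduce phase: sum the grouped counts per word.
counts = {word: sum(ones) for word, ones in groups.items()}
print(counts)  # {'big': 2, 'data': 3, 'hadoop': 1}
```

Because each reducer only ever sees the values for its own key, the reduce phase can be sharded across machines, which is what makes the pattern scale.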