Hey there!
Are you passionate about the Internet of Anything (IoAT), Data Science, and Visualizations?
Hortonworks is a leading open source technology company powering the next generation of data applications. Our 100% open source platforms, powered by Apache Hadoop and Apache NiFi, provide an open and stable foundation for enterprises to build and deploy big data solutions. Hortonworks is the trusted source of information on open source data projects and, along with the Apache community, is making the extended Hadoop ecosystem robust and easier to manage and use. Hortonworks provides unmatched technical support, training, and certification programs for enterprises, systems integrators, and technology vendors. For more information, visit www.hortonworks.com.
Hortonworks Data Platform (HDP) can handle massive amounts of data and has all the tools to get you up and running quickly on awesome projects. Combine HDP with Hortonworks Data Flow (HDF), powered by Apache NiFi, and you have a powerful graphical interface for managing data flows. On top of it all, you can use Apache Zeppelin notebooks to quickly prototype and visualize your ideas.
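To give you a taste, a single Zeppelin paragraph is often enough to turn query results into a chart. Here's a minimal sketch, assuming the Zeppelin %pyspark interpreter with its standard sqlContext; the table name "tweets" and its "sentiment" column are hypothetical, so point it at whatever table your own flow produces:

```python
%pyspark
# Minimal Zeppelin paragraph sketch: count tweets per sentiment label and
# let Zeppelin chart the result. The "tweets" table and "sentiment" column
# are hypothetical placeholders; adjust them to match your data.
rows = sqlContext.sql(
    "SELECT sentiment, COUNT(*) AS n FROM tweets GROUP BY sentiment"
).collect()

# Output that begins with %table is rendered by Zeppelin as an interactive
# table that can be switched to a bar, pie, or line chart.
print("%table sentiment\tcount")
for row in rows:
    print("%s\t%d" % (row.sentiment, row.n))
```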
If you're new to this, don't worry. We have a step-by-step tutorial that will guide you along and introduce you to everything.
The best part? You'll be working with cutting-edge tools and you'll earn bragging rights to say you have built an Internet of Anything (IoAT) platform running sentiment analysis with awesome visualizations!
We’ll have mentors ready to assist you in person or via Slack, and we’ll be posting any updates via Devpost as well.
Finally, we have great prizes totaling up to $7,200!
FYI, we're hiring interns! Stop by our booth to find out more.
YOUR MISSION
Here's our Twitter Sentiment Analysis tutorial using HDP, NiFi, and a few other great tools: Analyzing Twitter Data with Apache NiFi and HDP Search.
If you are completely new to Hadoop, you might want to review the Hello to Hadoop tutorial first.
Your mission, should you choose to accept it, is to run Twitter sentiment analysis on an interesting topic (such as the 2016 presidential candidates or specific stocks), enhance the sentiment analysis, and/or create new and unique visualizations using Zeppelin notebooks or other visualization tools such as D3.js.
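If you'd like a feel for what sentiment scoring can look like before diving into the full tutorial, here's a minimal, self-contained Python sketch of one common approach, dictionary-based scoring. The word lists and sample tweets below are toy placeholders, not part of the tutorial; swap in a real sentiment dictionary and a live tweet stream for anything serious:

```python
# Minimal sketch of dictionary-based sentiment scoring. The word lists
# and sample tweets are toy placeholders for illustration only.
POSITIVE = {"great", "awesome", "love", "win", "good"}
NEGATIVE = {"bad", "terrible", "hate", "lose", "awful"}

def score(tweet):
    """Classify a tweet as positive/negative/neutral by naive word counts."""
    words = tweet.lower().split()
    s = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if s > 0:
        return "positive"
    if s < 0:
        return "negative"
    return "neutral"

if __name__ == "__main__":
    for t in ["I love this awesome hackathon",
              "This traffic is terrible, I hate it",
              "Just another Tuesday"]:
        print("%-40s -> %s" % (t, score(t)))
```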
So be creative and build something amazing!
Follow these links to get started:
- Analyzing Twitter Data with Apache NiFi and HDP Search Tutorial
- Hortonworks Data Platform (HDP) Overview
- Hortonworks Data Flow (HDF) Overview
If you have a machine with at least 12GB of RAM available, you may download, install, and run HDP and HDF on your laptop/PC. Note that 8GB of RAM is required for the HDP Sandbox VirtualBox image. To run locally, follow these links:
- HDP Sandbox On Premise Deployment Overview
- HDP Download (large file*, 9GB+)
- HDF Download
* If file download is slow, please stop by the Hortonworks booth and we’ll have all the files ready for you on a USB stick for a quick transfer.
If you DO NOT have a machine with at least 12GB of RAM, we highly recommend that you run your project in the Microsoft Azure cloud. See the link below for more info:
Finally, make sure to check out Hortonworks Community Connection; it's a great resource for Hadoop and data-related questions, should you need expert advice from the larger community!
Eligibility
All Stanford Winter 2016 TreeHacks contestants are eligible to participate, subject to any applicable restrictions in the Rules section.
Requirements
- Project must run on HDP, on HDF with Apache NiFi, or on both. It can be either an on-premise (local) deployment or a cloud deployment.
Note: For an on-premise (local) deployment, you should have 8GB of RAM available for the VirtualBox image; it is therefore recommended that you have at least 12GB of RAM on your laptop/PC.
Note: Should you not have sufficient RAM available, we highly recommend running HDP and/or HDF in the Microsoft Azure cloud.
- Submit a PDF describing all the steps, code, screenshots, links to external software/plugins, and any other relevant information necessary to replicate your project. All submissions must be made via Devpost* by 12PM PST on February 14, 2016.
- Demo a working project in front of the Hortonworks judges. Each team will have up to five (5) minutes to demo their project.
* In the event that there are problems with submissions via Devpost, send your PDF to hackathon@hortonworks.com or visit the Hortonworks table and we’ll handle it from there.
Prizes
$7,200 in prizes
Up to $1,200 Total Cash Prize
Winning team receives a total cash prize of up to $1,200 ($300 per team member).
Up to four 2016 San Jose Hadoop Summit Passes
Winning team receives up to four 2016 San Jose Hadoop Summit Passes, one pass per team member. Non-transferable. Travel, lodging, and any other per diem costs related to attendance at the 2016 San Jose Hadoop Summit are not covered by Hortonworks.
Team Presentation of Winning Idea
Winning team is invited to present their idea at the 2016 San Jose Hadoop Summit.
How to enter
- Register for this hackathon via Devpost; and
- Send an email to hackathon@hortonworks.com with the following info:
In the email Subject line include: TreeHacks 2016
In the email Body include the following:
- Your Team Name and all the team members.
- Where your team is located (so we can easily find you).
Once submitted, we'll send you a confirmation email and you can start hacking away.
Note: In the event that you do not receive an email confirmation, please resend your email or visit us at the Hortonworks table and we'll confirm in person.
Judges
- Robert Hryniewicz, Data Evangelist / Hortonworks
- Ali Bajwa, Solutions Engineer / Hortonworks
- Alejandro Fernandez, Coding Ninja / Hortonworks
- Sumit Mohanty, Apache Ambari PMC / Hortonworks
Judging Criteria
Evaluation
Submitted entries and demos will be evaluated by each judge and assigned a Final Score out of 100 total available points. A maximum of 25 points will be awarded in each of the following four categories:
- Working Project (25 points): The project must run on HDP and/or HDF, on premise (local) or in the cloud. It doesn't have to be complete, just far enough along to demo.
- Impact (25 points): Great ideas that make a meaningful contribution and have clear purpose and value.
- Visualization & UI (25 points): Let's see it to believe it.
- Uniqueness (25 points): We want to inspire original and new ideas.
Questions? Email the hackathon manager