Sameer Farooqui Expert Profile
Most recently, Sameer was a Systems Architect at Hortonworks, where he specialized in designing Hadoop prototypes and proof-of-concept use cases. While at Hortonworks, he also taught Hadoop developer classes and visited customers as a sales engineer to brainstorm use cases. The core Hadoop products he specializes in are HDFS, MapReduce, Pig, Hive, HBase, and ZooKeeper.
Previously, Sameer worked at Accenture's Silicon Valley R&D lab, where he was responsible for evaluating NoSQL databases, cloud computing, and MapReduce for their commercial applicability to emerging big data problems. At Accenture Tech Labs, he was the lead engineer on a 32-node prototype built with Cassandra and AWS to host 10 TB of Smart Grid data. He also worked on a 30+ person team during the design phase of a multi-environment Hadoop cluster pilot project at NetApp.
Before Hortonworks and Accenture, Sameer spent five years at Symantec, where he deployed Veritas Clustering and Storage Foundation solutions (VCS, VVR, SF-HA) for Fortune 500 and government clients throughout North America.
Sameer is a regular speaker at Big Data conferences and meetups.

Speaking Engagements:
- June 2013 | Cassandra Summit | SF | Comparing Architectures: Cassandra vs. the Field
- March 2013 | Big Data Tech Day | NYC | How to Analyze the Human Genome with Cassandra
A 30-minute sample of Sameer's Hadoop training class is available.
Sameer Farooqui is scheduled to teach the following classes:
- Aug 26 - Aug 30 | Hadoop Online for Admins and Developers | Online Classroom
- Oct 1 - Oct 3 | Hadoop for Admins and Developers | San Francisco