Technical Function: Data Management, Data Engineering
Technology & Tools: Big Data and Cloud (Apache Hadoop, Apache Hive, Apache Pig, Hadoop MapReduce, Apache Spark); Programming Languages and Frameworks (Java, SQL, Python)
Experfy is hiring on behalf of a Fortune 100 client. This is a one-year contract in Cupertino, CA, and offers the opportunity to work at one of the most innovative companies in the world.
Key Responsibilities
Support our team with the development of high-performance data pipelines.
Qualifications
Required:
5+ years of experience with core Java and SQL
3+ years of experience with Python and Unix shell scripting
3+ years of experience building scalable, high-performance data pipelines using Apache Hadoop, MapReduce, Apache Pig, and Apache Hive
Experience with cross-platform big data file formats such as Apache Avro and Apache Parquet
Experience with Apache Spark is a plus
Preferred:
Hadoop Certification or Spark Certification
Please respond with your resume and a crisp summary of your qualifications.
Only candidates who are authorized to work in the US will be considered.