• Hadoop/Apache Spark Engineer

    Location: US-IL-Chicago
    Job ID: 49342
    Function: Engineering
  • Overview

    Want to develop cool technology that helps our clients make smarter decisions? As a member of the Enterprise Intelligence Software business unit, a high-growth incubator business backed by the financial strength of Zebra Technologies, you will be instrumental in creating new breakthrough products. Our motto is "Have Fun At Work" and we want employees who enjoy their work. We work together in agile teams to drive our business forward, and we are looking for collaborative individuals to join our family and contribute to our success. Inspired? Motivated? Think you can do it? Come join us!


    The selected candidate will be part of a larger product development team responsible for building reusable components and solutions. Team members work with data from mobile devices, sensors, and various support-related source systems. The team works with cutting-edge technologies such as Spark, Hadoop, HDFS, Cassandra, and Elasticsearch, and also has the opportunity to apply predictive analytics algorithms to deliver actionable insights using various machine learning toolsets. Our customers span many domains, including transportation & logistics, healthcare, retail, and manufacturing.

    Responsibilities

    Working within a cross-functional, agile development team, this Hadoop/Apache Spark Engineer will be responsible for hands-on development, technical delivery, and support of innovative components used in data-driven solutions. The candidate will have a sound working knowledge of Apache Spark, Scala, Java, Spring, NoSQL, SQL, and other tools, and should be self-motivated and able to work through the rapidly changing requirements typical of innovative business units.

    Qualifications

    • 5+ years of experience with Apache Spark.
    • 10+ years of combined experience in a Java/Scala environment.
    • A minimum of 2-3 years of strong Scala experience is a must.
    • Experience with functional programming.
    • Experience with Java 8 environments.
    • Experience with Hive/Beeline and Spark SQL.
    • Experience with Cassandra and Elasticsearch.
    • Experience with Oozie and Hue.
    • Experience building REST APIs with Spring Boot.
    • Should be able to ramp up quickly and work independently with minimal supervision.
    • Should have strong troubleshooting skills to get to the bottom of production problems.
    • U.S. Only:
      • Preferred Education: Bachelor's or Master's degree in an appropriate engineering discipline.
      • Preferred Work Experience (years): Bachelor's degree and 5+ years of experience, or Master's degree and 3+ years of engineering experience.
    • All other Regions: 
      • Preferred Education: Bachelor’s degree
      • Preferred Work Experience (years): 8+ years of work experience
