Hadoop Developer (Java Background Exp) Job at Cloud Analytics Technologies LLC, Dearborn, MI

  • Cloud Analytics Technologies LLC
  • Dearborn, MI

Job Description

Title: Hadoop Developer
Location: Dearborn, Michigan
Duration: Contract

Job Summary: The Hadoop Developer will provide expertise in a wide range of technical areas, including but not limited to: the Cloudera Hadoop ecosystem, Java, integration of collaboration toolsets using SSO, configuration management, hardware and software configuration and tuning, software design and development, and the application of new technologies and languages aligned with other FordDirect internal projects. Past or current experience on a CRM integration assignment is preferred.

Essential Job Functions:
1. Design and develop data ingestion pipelines.
2. Perform data migration and conversion activities.
3. Develop and integrate software applications using suitable development methodologies and standards, applying standard architectural patterns and taking into account critical performance characteristics and security measures.
4. Collaborate with Business Analysts, Architects, and Senior Developers to establish the physical application framework (e.g., libraries, modules, execution environments).
5. Perform end-to-end automation of the ETL process for the various datasets ingested into the big data platform.
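As a rough illustration of the kind of ingestion/transformation step items 1 and 5 describe, here is a minimal sketch in plain Java (no Hadoop dependencies; the record layout, field names, and class name are hypothetical, not taken from the posting):

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

// Hypothetical sketch of one ETL transform step: take raw CSV-style
// "id,name,score" rows, drop malformed records, and emit clean maps
// ready to be loaded into a target store.
public class IngestionSketch {

    // Parse raw rows, keeping only records with exactly three fields.
    static List<Map<String, String>> transform(List<String> rawRows) {
        return rawRows.stream()
                .map(row -> row.split(","))
                .filter(cols -> cols.length == 3)     // drop malformed rows
                .map(cols -> Map.of(
                        "id", cols[0].trim(),
                        "name", cols[1].trim(),
                        "score", cols[2].trim()))
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<String> raw = List.of("1,alice,90", "bad-row", "2,bob,85");
        List<Map<String, String>> cleaned = transform(raw);
        System.out.println(cleaned.size()); // two valid rows survive
    }
}
```

In a real Cloudera deployment this logic would typically live inside a Spark job or a MapReduce task orchestrated by Oozie rather than a standalone class; the sketch only shows the shape of the transform stage.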
Required:
1. Java/J2EE
2. Web applications, Tomcat (or an equivalent app server), RESTful services, JSON
3. Spring, Spring Boot, Struts, design patterns
4. Hadoop (Cloudera CDH), HDFS, Hive, Impala, Spark, Oozie, HBase
5. Scala
6. SQL
7. Linux
Good to Have:
1. Google Analytics, Adobe Analytics
2. Python, Perl
3. Flume, Solr
4. Strong database design skills
5. ETL tools
6. NoSQL databases (MongoDB, Couchbase, Cassandra)
7. JavaScript UI frameworks (Angular, Node.js, Bootstrap)
8. Good understanding and working knowledge of Agile development

Other Responsibilities:
1. Document and maintain project artifacts.
2. Suggest best practices and implementation strategies using Hadoop, Java, and ETL tools.
3. Maintain comprehensive knowledge of industry standards, methodologies, processes, and best practices.
4. Other duties as assigned.

Minimum Qualifications and Job Requirements:
1. Must have a Bachelor's degree in Computer Science or a related IT discipline.
2. Must have at least 5 years of IT development experience.
3. Must have strong, hands-on J2EE development experience.
4. Must have in-depth knowledge of Scala and Spark programming.
5. Must have 3+ years of relevant professional experience working with Hadoop (HBase, Hive, MapReduce, Sqoop, Flume), Java, JavaScript, .Net, SQL, Perl, Python, or an equivalent scripting language.
6. Must have experience with ETL tools.
7. Must have experience integrating web services.
8. Knowledge of standard software development methodologies such as Agile and Waterfall.
9. Strong communication skills.
10. Must be willing to flex work hours to support application launches and manage production outages as necessary.

Specific Knowledge, Skills and Abilities:
Ability to multitask across numerous projects and responsibilities.
Experience working with JIRA and wikis.
Must have experience working in a fast-paced, dynamic environment.
Must have strong analytical and problem-solving skills.
Must have good verbal and written communication skills.
Must be able and willing to participate as an individual contributor as needed.
Must have the ability to work the time necessary to complete projects and/or meet deadlines.

Job Tags

Contract work, Flexible hours
