• Sr. Hadoop Big Data Engineer

    Job Location: US-MI-Detroit
    Job ID:
    Category: Information Technology
    Business Line: CMC Corporate
  • Overview

    Who we are:

    Meridian, a WellCare Company, is part of a national network of passionate leaders, achievers, and innovators dedicated to making a difference in the lives of our members, our providers and in the healthcare industry.

    We provide government-based health plans (Medicare, Medicaid, and the Health Insurance Marketplace) in Michigan, Illinois, Indiana, and Ohio. As a part of the WellCare Family of companies, we deliver healthcare excellence to millions of members nationwide.

    Our associates work hard, play hard, and give back. Meridian associates enjoy an exceptional experience and culture including special events, company sports teams, potlucks, Bagel Fridays, and volunteer opportunities.


    A Day in the Life of a Sr. Hadoop Big Data Engineer:


    This is an exciting opportunity to use your varied skills and wear many hats while helping Meridian build out its leading-edge Hadoop and big data environment. Our goal is to leverage modern and emerging platforms to house our data for our many reporting and analytics needs. In this role you will be at the forefront of building out the environment, inventing new processes, configuring and developing components where necessary, and serving as an all-around Hadoop generalist. If you enjoy being asked to do a variety of things, this role will fill that need!




    • Architect and design the Hadoop environment that Meridian will implement
    • Oversee the installation and configuration of Hadoop components and related utilities
    • Design and develop processes for source data ingestion, transformation, and database loading
    • Design and develop processes for data quality monitoring
    • Design and develop processes to support “Data as a Service (DaaS)”

    • Design file system management, monitoring, support, and maintenance for HDFS, Kudu, and HBase

    • Design and develop scalable and high-performance web services for data tracking
    • Design and develop data lake structures and processes
    • Establish and oversee security policies and procedures for the Hadoop environment
    • Lead the development and implementation of various strategic initiatives
    • Lead the development of Architecture Policies, Standards and Governance for the Hadoop and Big Data environment
    • Lead research and development with promising leading edge big data technologies
    • Actively participate in data architecture design and review processes, including planning and monitoring efforts, reviewing deliverables, and communicating to management
    • Respond to change and engage in multiple projects simultaneously
    • Work with minimal guidance; seek guidance on only the most complex tasks
    • Instruct and mentor other Hadoop engineers


    • Bachelor’s degree in related field is required
    • 7+ years of experience as an IT professional
    • 5+ years of programming with Java and/or Python
    • 3+ years working with Hadoop (Cloudera, Hortonworks, or MapR)
    • 5+ years working with data design or database development
    • Experience with reporting tools such as Tableau, QlikView, Datameer, etc., would be a plus
    • Prior experience in a complex, highly integrated services environment
    • Working knowledge of Red Hat Linux
    • Expert in multi-threading and concurrency concepts
    • Deep understanding of and experience developing in Hadoop
    • Deep working knowledge of Kafka, Flume, Hive, Spark, Impala, Sqoop, Oozie, HBase, ZooKeeper, and Hue
    • Expert level SQL knowledge and experience with a relational database
    • Working knowledge of Pig Latin, HiveQL, Python and/or Java
    • Substantial understanding of reporting and analytics tools
    • Deep experience working with data lakes
    • Deep knowledge of pre- and post-installation of Hadoop software and a good understanding of Hadoop ecosystem dependencies
    • Experience implementing data ingress and egress: facilitating generic input/output and moving bulk data into and out of Hadoop
    • Expertise in the setup, configuration, and management of data security
    • Ongoing support for various Hadoop environments: DEMO, TEST, UAT, and PROD
    • Working knowledge of Cloudera (or Hortonworks or MapR), Kafka, Spark, data lakes, and Flume is required
    • Excellent written and oral communication skills
    • Proficiency in MS Office (Word, Excel, Outlook, PowerPoint, Access, Visio)
    • Strong consultative and presentation skills
    • Assertive (yet diplomatic) personality and the ability to handle change
    • Ability to work well in extended project teams with little oversight
    • Demonstrated leadership capability


    What Meridian can offer you:

    • Our healthcare benefits include a variety of plans that are effective on the first day of employment for our new full-time team members.
    • Opportunity to work with the industry’s leading technologies and participate in unique projects, demonstrations, conferences, and exclusive learning opportunities.
    • Meridian offers 401k matching that is above the national average.
    • Full-time Meridian employees are eligible for tuition reimbursement towards Bachelor’s or Master’s degrees.
    • Meridian was named Detroit's #1 Fastest Growing Company by Crain's Magazine, so it is a great time to get involved.



    Connect With Us!

    Not ready to apply? Connect with us for general consideration.