Leads and participates in the design, build, and management of large-scale data structures, pipelines, and efficient Extract/Transform/Load (ETL) workflows. This is a highly visible role that interfaces with internal stakeholders in claims and finance.
- Develops large-scale data structures and pipelines to organize, collect, and standardize data that help generate insights and address reporting needs.
- Writes ETL (Extract / Transform / Load) processes, designs database systems and develops tools for real-time and offline analytic processing.
- Collaborates with data science team to transform data and integrate algorithms and models into automated processes.
- Applies knowledge of Hadoop architecture and HDFS commands, along with experience designing and optimizing queries, to build data pipelines. Uses strong programming skills in Python, Java, or another major language to build robust data pipelines and dynamic systems. Builds data marts and data models to support Data Science and other internal customers.
- Integrates data from a variety of sources, assuring that they adhere to data quality and accessibility standards.
- Analyzes current information technology environments to identify and assess critical capabilities and recommend solutions. Experiments with available tools and advises on new tools to determine the optimal solution given the requirements dictated by the model or use case.
- Ability to understand complex systems and solve challenging analytical problems.
- Experience with bash shell scripts, UNIX utilities, and UNIX commands.
- Knowledge of Hadoop architecture and HDFS commands, and experience designing and optimizing queries against data in the HDFS environment.
- 4 or more years of progressively complex related experience.
- Ability to leverage multiple tools and programming languages to analyze and manipulate data sets from disparate data sources.
- Strong problem solving skills and critical thinking ability.
- Knowledge of Java, Python, Hive, Cassandra, Pig, MySQL, NoSQL, or similar technologies.
- Experience building data transformation and processing solutions.
- Strong knowledge of large-scale search applications and building high-volume data pipelines.
- Strong collaboration and communication skills within and across teams.
- Master's degree preferred.
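To illustrate the Extract/Transform/Load work described above, here is a minimal Python sketch of an ETL step that standardizes raw records. All names and fields (`extract`, `transform`, `load`, `claim_id`, `amount_usd`) are hypothetical examples, not tied to any specific system at the employer; a production pipeline would read from and write to real data stores such as HDFS or a database.

```python
# Minimal, self-contained ETL sketch: extract raw records, standardize
# field names and types, and load the results into a target sink.
# All names here are illustrative assumptions, not a real system.

def extract(rows):
    """Extract: yield raw records from an upstream source."""
    for row in rows:
        yield row

def transform(record):
    """Transform: standardize field names and types for downstream use."""
    return {
        "claim_id": str(record["id"]),
        "amount_usd": round(float(record["amount"]), 2),
    }

def load(records, sink):
    """Load: append standardized records to a target store (here, a list)."""
    for record in records:
        sink.append(record)
    return sink

# Example run over two raw records with string-typed amounts.
raw = [{"id": 101, "amount": "250.45"}, {"id": 102, "amount": "99.90"}]
sink = load((transform(r) for r in extract(raw)), [])
```

The three stages are kept as separate functions so each can be tested and swapped independently, which is the usual shape of the transformation and processing solutions the posting describes.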
Vacancy Type: Full Time
Job Location: Hartford, CT, US
Application Deadline: N/A