Full-time

Job Description

We are looking for a savvy Data Engineer to join our growing team of analytics experts. The new hire will be responsible for expanding and optimizing our data pipeline architecture.

The ideal candidate is an experienced data pipeline builder and data wrangler with strong experience in handling data at scale.

The Data Engineer will support our software developers, data analysts and data scientists on various data initiatives.

This is a remote role that can be done anywhere in the continental US; work is on Eastern time zone hours.

Why this role

This is a highly visible role within the enterprise data lake team. Working with our Data group and business analysts, you will lead the creation of the data architecture that produces the data assets enabling our data platform.

This role requires working closely with business leaders, architects, engineers, data scientists, and a wide range of stakeholders throughout the organization to build and execute our strategic data architecture vision.

Job Duties

  • Apply an extensive understanding of SQL, fine-tuning queries using RDBMS performance features such as indexes, partitioning, EXPLAIN plans, and cost-based optimizers.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and the AWS technology stack.
  • Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
  • Work with data scientists and industry leaders to understand data needs and design appropriate data models.
  • Participate in the design and development of the AWS-based data platform and data analytics.

Qualifications

Skills Needed

  • Design and implement data ETL frameworks for a secured data lake, creating and maintaining an optimal pipeline architecture.
  • Examine complex data to optimize the efficiency and quality of the data being collected, resolve data quality problems, and collaborate with database developers to improve systems and database designs.
  • Hands-on experience building data applications using AWS Glue, Lake Formation, Athena, AWS Batch, AWS Lambda, Python, and Linux shell and batch scripting.
  • Hands-on experience with AWS database services (Redshift, RDS, DynamoDB, Aurora, etc.).
  • Experience writing advanced SQL involving self-joins, window functions, correlated subqueries, CTEs, etc.
  • An understanding of data management fundamentals, including concepts such as data dictionaries, data models, validation, and reporting.
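As an illustration of the "advanced SQL" bullet above, the following is a hypothetical sketch (table and column names are made up) of a query combining a CTE with a window function, run against an in-memory SQLite database from Python:

```python
import sqlite3

# Hypothetical sketch: a CTE plus a window function of the kind the
# "advanced SQL" bullet describes, run against an in-memory SQLite table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (id INTEGER, customer TEXT, amount REAL);
INSERT INTO orders VALUES
  (1, 'acme', 100.0), (2, 'acme', 250.0), (3, 'globex', 75.0);
""")

query = """
WITH totals AS (                       -- CTE: per-customer totals
  SELECT customer, SUM(amount) AS total
  FROM orders
  GROUP BY customer
)
SELECT customer, total,
       RANK() OVER (ORDER BY total DESC) AS rnk   -- window function
FROM totals
ORDER BY rnk;
"""
rows = conn.execute(query).fetchall()
print(rows)  # [('acme', 350.0, 1), ('globex', 75.0, 2)]
```

The same pattern carries over to Redshift or any RDBMS named above; only the connection layer changes.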

Education and Training

  • Minimum of 5 years full-time software engineering experience with at least 2 years in an AWS environment focused on application development.
  • Bachelor’s degree or foreign equivalent degree in Computer Science, Software Engineering, or related field


Additional Information

In 2022, Verisk received Great Place to Work® Certification for our outstanding workplace culture for the sixth year in a row and second-time certification in the UK, Spain, and India.

We’re also one of the 38 companies on the UK’s Best Workplaces™ list and one of 18 companies on Spain’s Best Workplaces™ list.

For over fifty years and through innovation, interpretation, and professional insight, Verisk has replaced uncertainty with precision to unlock opportunities that deliver significant and demonstrable impact.

From our historic roots in risk assessment, we’ve grown to provide analytic insights that help transform industries focused on some of the world’s most critical areas.

Today, the insurance industry relies on Verisk to be, and to make the world, more productive, resilient, and sustainable.

Verisk works in collaboration with our customers and at the intersection of people, data, and advanced technologies. Through proprietary platformed analytics, advanced modeling, and interpretation, we deliver immediate and sustained value to our customers and through them, to the individuals and societies they serve, with greater speed, precision, and scale.

We’re 9,000 people strong, committed to translating big data into big ideas. We help others see new possibilities and empower certainty into big decisions that impact individuals and societies.

And we relentlessly and ethically pursue innovation to help move our customers, and the world, toward better tomorrows.

Everyone at Verisk, from our chief executive officer to our newest employee, is guided by The Verisk Way: to Be Remarkable, Add Value, and Innovate.

  • Be Remarkable by doing something better each day in service to our customers and each other
  • Innovate by redefining what’s possible, embracing challenges, and pushing boundaries

Verisk Businesses

Underwriting Solutions provides underwriting and rating solutions for auto and property, general liability, and excess and surplus to assess and price risk with speed and precision

Claims Solutions supports end-to-end claims handling with analytic and automation tools that streamline workflow, improve claims management, and support better customer experiences

Property Estimating Solutions offers property estimation software and tools for professionals in estimating all phases of building and repair to make day-to-day workflows the most efficient

Extreme Event Solutions provides risk modeling solutions to help individuals, businesses, and society become more resilient to extreme events.

Specialty Business Solutions provides an integrated suite of software for full end-to-end management of insurance and reinsurance business, helping companies manage their businesses through efficiency, flexibility, and data governance

Marketing Solutions delivers data and insights to improve the reach, timing, relevance, and compliance of every consumer engagement

Life Insurance Solutions offers end-to-end, data insight-driven core capabilities for carriers, distribution, and direct customers across the entire policy lifecycle of life and annuities for both individual and group.

Verisk Maplecroft provides intelligence on sustainability, resilience, and ESG, helping people, business, and societies become stronger

At Verisk you can build an exciting career with meaningful work; create positive and lasting impact on business; and find the support, coaching, and training you need to advance your career.

We have received the Great Place to Work® Certification for the 7th consecutive year. We’ve been recognized by Forbes as a World’s Best Employer and a Best Employer for Women, testaments to our culture of engagement and the value we place on an inclusive and diverse workforce.

Verisk’s Statement on Racial Equity and Diversity supports our commitment to these values and affecting positive and lasting change in the communities where we live and work.

Verisk Analytics is an equal opportunity employer.

All members of the Verisk Analytics family of companies are equal opportunity employers. We consider all qualified applicants for employment without regard to race, religion, color, national origin, citizenship, sex, gender identity and/or expression, sexual orientation, veteran status, age, or disability.

Unsolicited resumes sent to Verisk, including unsolicited resumes sent to a Verisk business mailing address, fax machine or email address, or directly to Verisk employees, will be considered Verisk property.

Verisk will NOT pay a fee for any placement resulting from the receipt of an unsolicited resume.


Related Jobs

Data engineer

Verisk New York, NY

Data engineer

Lorven Technologies, Inc. New York, NY

Our client is looking for a Data Engineer for a long-term project in Bloomfield, CT; New York, NY; Austin, TX; or Chicago, IL (initially remote). Below are the detailed requirements.

Job Title : Data Engineer

Duration : Long Term W2 Tax Term

Job description :

  • Bachelor's degree in Computer Science or equivalent, with a minimum of 9 years of relevant experience.
  • Must have experience with PySpark, Python, Angular, SQL, Azure Databricks, and metadata.
  • Knowledge of at least one of: Azure Data Factory, Azure Data Lake, Azure SQL DW, Azure SQL
  • Expertise in ETL, API development, microservices design, and cloud deployment solutions
  • Experience with RESTful APIs using message formats such as JSON and XML
  • Experience with integration technologies such as Kafka
  • Experience in Python and frameworks such as Flask or Django
  • Experience with RDBMS and NoSQL databases
  • Good understanding of SQL, T-SQL, and/or PL/SQL
  • Hands-on experience developing applications on AWS and/or OpenShift
  • Automation skills using Infrastructure as Code
  • Familiarity with creating web applications using AngularJS or React
  • Familiarity with creating benchmark tests, designing for scalability and performance, and designing/integrating large-scale systems
  • Familiarity with building cloud-native applications; knowledge of cloud tools such as Kubernetes and Docker containers
  • Excellent communication skills, including the ability to communicate effectively with internal and external customers
  • Ability to use strong industry knowledge to relate to customer needs and resolve customer concerns, with a high level of focus and attention to detail
  • Strong work ethic and good time management, with the ability to work with diverse teams and lead meetings
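To illustrate the "message formats such as JSON and XML" requirement, here is a hypothetical sketch (the `record` element and payload fields are made up) rendering the same API payload in both formats using only the Python standard library:

```python
import json
import xml.etree.ElementTree as ET

# Hypothetical payload for illustration; field names are made up.
payload = {"id": 42, "status": "active"}

# JSON: the common default for RESTful services
json_body = json.dumps(payload)

# XML: still required by many enterprise integrations
root = ET.Element("record")
for key, value in payload.items():
    ET.SubElement(root, key).text = str(value)
xml_body = ET.tostring(root, encoding="unicode")

print(json_body)  # {"id": 42, "status": "active"}
print(xml_body)   # <record><id>42</id><status>active</status></record>
```

A framework such as Flask or Django would produce these bodies inside a request handler, but the serialization step itself is the same.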
Full-time

Data Engineer

LMI New York, NY

Overview

LMI is a consultancy dedicated to powering a future-ready, high-performing government, drawing from expertise in digital and analytic solutions, logistics, and management advisory services.

We deliver integrated capabilities that incorporate emerging technologies and are tailored to customers’ unique mission needs, backed by objective research and data analysis.

Founded in 1961 to help the Department of Defense resolve complex logistics management challenges, LMI continues to enable growth and transformation, enhance operational readiness and resiliency, and ensure mission success for federal civilian and defense agencies.

This position is remote but may require travel to a client site in Washington, DC (Georgetown).

Responsibilities

As a Data Engineer you will help develop and deploy technical solutions to solve our customers’ hardest problems, using various platforms to integrate data, transform insights, and build first-class applications for operational decisions.

You will leverage everything around you : core customer products, open source technologies (e.g. GHE), and anything you and your team can build to drive real impact.

In this role, you work with customers around the globe, where you gain rare insight into the world’s most important industries and institutions.

Each mission presents different challenges, from the regulatory environment to the nature of the data to the user population.

You will work to accommodate all aspects of an environment to drive real technical outcomes for our customers.

Core Responsibilities

  • Set up transfers of data feeds from source systems into a location accessible to Foundry, and integrate them with existing data using enterprise architecture best practices
  • Debug issues related to delayed or missing data feeds
  • Monitor build progress and debug build problems in conjunction with deployment teams
  • Use Foundry's application development framework to design applications that address operational questions
  • Run rapid development and iteration cycles with SMEs, including testing and troubleshooting application issues
  • Execute requests for information (RFIs) surrounding the platform's data footprint

Qualifications

  • Bachelor’s degree in data science, mathematics, statistics, economics, computer science, engineering, or a related business or quantitative discipline (Master’s degree preferred)
  • Preferred : Interim or Active DoD Secret clearance.
  • Strong engineering background, preferably in fields such as Computer Science, Mathematics, Software Engineering, Physics, or Data Science.
  • Proficiency with programming languages such as Python (PySpark, Pandas), SQL, R, JavaScript, or similar languages.
  • Working knowledge of databases and SQL; preferred qualifications include linking analytic and data visualization products to database connections
  • At least 9 years of experience in the field
  • Ability to work effectively in teams of technical and non-technical individuals.
  • Skill and comfort working in a rapidly changing environment with dynamic objectives and iteration with users.
  • Demonstrated ability to continuously learn, work independently, and make decisions with minimal supervision.
  • Proven track record of strong customer communications, including feedback gathering, execution updates, and troubleshooting.



LMI is an Equal Opportunity Employer. LMI is committed to the fair treatment of all and to our policy of providing applicants and employees with equal employment opportunities.

LMI recruits, hires, trains, and promotes people without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, pregnancy, disability, age, protected veteran status, citizenship status, genetic information, or any other characteristic protected by applicable federal, state, or local law.

If you are a person with a disability needing assistance with the application process, please contact


Temporary

Senior data engineer

Fitch Ratings New York, NY

At Fitch, we have an open culture where employees are able to exchange ideas and perspectives, throughout the organization, irrespective of their seniority.

Your voice will be heard allowing you to have a real impact. We embrace diversity and appreciate authenticity encouraging an environment where employees can be their true selves.

Our inclusive and progressive approach helps us to keep a balanced perspective. Fitch is also committed to supporting its employees by advancing conversations around diversity, equity and inclusion.

Fitch’s Employee Resource Groups (ERGs) have been established by employees who have joined together as a workplace community based on similar backgrounds or life experiences.

Fitch’s ERGs are available to connect employees with others within the organization to offer professional and personal support.

With our expertise, we are not only creating data and information, but also producing timely insights from every angle to influence decision making in this ever changing and highly competitive market.

We have a relentless hunger to innovate and unlock the power of human insights and to drive value for our customers. There has never been a better time to make an impact and we invite you to join us on this journey.

Fitch Ratings is a leading provider of credit ratings, commentary and research. Dedicated to providing value beyond the rating through independent and prospective credit opinions, Fitch Ratings offers global perspectives shaped by strong local market experience and credit market expertise.

The additional context, perspective and insights we provide have helped fund a century of growth and enables you to make important credit judgments with confidence.


Fitch is seeking a strong Data Engineer to improve critical data systems used widely by internal and external stakeholders.

The ideal candidate is someone who:

  • Has 5+ years of data engineering experience developing large data pipelines
  • Has strong experience developing in Python and Java
  • Has strong experience with relational SQL and NoSQL databases

Roles & Responsibilities

  • Build data pipelines and applications to stream and process datasets at low latencies.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL, NoSQL, and Kafka on AWS big data technologies.
  • Collaborate with Data Product Managers, Data Architects, and other Data Engineers to design, implement, and deliver successful data solutions.
  • Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
  • Track data lineage, ensure data quality and improve discoverability of data.
  • Work in Agile Environment (Scrum) and interact with multi-functional teams (Product Owners, Scrum Masters, Developers, Designers, Data Analysts)
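As a toy illustration of the stream-processing responsibility above, here is a hypothetical sketch of a tumbling-window aggregation, the basic building block behind processing datasets at low latency. It uses no real Kafka; the window size and event shape are assumptions:

```python
from collections import defaultdict

# Toy sketch (no real Kafka): count events per key in 60-second
# tumbling windows, as a stream processor such as Kafka Streams would.
WINDOW_SECONDS = 60

def window_counts(events):
    """Group (timestamp, key) events into tumbling windows keyed by
    window start time, counting occurrences of each key per window."""
    counts = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        window_start = int(ts // WINDOW_SECONDS) * WINDOW_SECONDS
        counts[window_start][key] += 1
    return {w: dict(c) for w, c in counts.items()}

events = [(0, "click"), (10, "click"), (75, "view"), (80, "click")]
print(window_counts(events))
# {0: {'click': 2}, 60: {'view': 1, 'click': 1}}
```

In a real pipeline the events would arrive from a Kafka topic and the counts would be emitted downstream, but the windowing logic is the same.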

Required Skills

  • Strong experience developing in Python & Java.
  • 5+ years of data engineering experience developing large data pipelines
  • Strong SQL and NoSQL skills and ability to create queries to extract data and build performant datasets.
  • Hands-on experience with message queuing and stream data processing (Kafka Streams).

Desirable Skills

  • Experience with relational SQL and NoSQL databases, any RDBMS (Oracle, Postgres) and NoSQL (Cassandra, Mongo, or Redis, etc.).
  • Hands-on experience with distributed systems such as Spark, Hadoop (HDFS, Hive, Presto, PySpark) to query and process data.
  • Strong analytic skills related to working with unstructured datasets.
  • Hands-on experience using AWS cloud services: EC2, Lambda, S3, Athena, Glue, and EMR
  • Experience with Redshift or Snowflake
  • Experience in the Financial Services industry

Person specification

  • Excellent problem solving and analytical skills
  • Highly motivated to deliver results and meet deadlines


Full-time

Data Engineer

Fourier Ltd New York, NY

Fourier has partnered with several world-leading hedge funds, prop traders, and market makers in a search for elite and eager data engineers to join them.

Our clients are looking for the best data engineers in the industry with a proven track record of delivering scalable and robust data systems and are driven by solving the seemingly unsolvable problems.

They are looking for individuals who are driven and motivated but most importantly - excited by Data!

Do you love working with Python and have experience managing ETL pipelines, building a scalable distributed data platform or deriving insights from alternative data sets?

Do you have an affinity for learning new technologies and always looking to broaden your existing technical knowledge?

These clients are at the pinnacle of finance, and therefore leading compensation packages should be expected.

Primary Tech Stack:
  • Python
  • Data storage and manipulation tools such as SQL, Pandas, NumPy.
  • Various ETL/ELT technologies.
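As a minimal illustration of the ETL pipelines this stack targets, here is a hypothetical sketch (the `quotes` table and its columns are made up) that extracts rows from CSV, transforms them in Python, and loads them into SQLite using only the standard library:

```python
import csv
import io
import sqlite3

# Hypothetical sketch of a minimal ETL pipeline: extract from CSV,
# transform in Python, load into SQLite. Names are illustrative only.
raw = "symbol,price\nAAPL,190.5\nMSFT,410.0\n"

# Extract: parse the CSV source into dictionaries
rows = list(csv.DictReader(io.StringIO(raw)))

# Transform: cast prices from text to float
records = [(r["symbol"], float(r["price"])) for r in rows]

# Load: insert into the target table
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE quotes (symbol TEXT, price REAL)")
conn.executemany("INSERT INTO quotes VALUES (?, ?)", records)

print(conn.execute("SELECT COUNT(*), MAX(price) FROM quotes").fetchone())
# (2, 410.0)
```

In practice the extract step would read from production feeds and the load step would target a warehouse, with Pandas or an ELT tool replacing the hand-rolled transform.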
Full-time