Bread Financial, a prominent player in the financial sector, is currently hiring for the position of Data Engineer 1, based in Karnataka. The role is open to candidates holding a Bachelor’s Degree in Computer Science or Engineering and requires 0 to 3 years of experience, making it an excellent opportunity both for fresh graduates entering data engineering and for professionals with some experience under their belt. As a Data Engineer 1, the selected candidate will work across many aspects of data management and engineering, contributing to the company’s data-driven initiatives.
Company Name: Bread Financial
Job Role: Data Engineer 1
Education Required: Bachelor’s Degree in Computer Science or Engineering
Experience Required: 0 to 3 years
Job Location: Karnataka
Role and Responsibilities:
- Collaboration – Collaborates with internal/external stakeholders to manage data logistics, including data specifications, transfers, structures, and rules.
- Collaborates with business users, business analysts, and technical architects to transform business requirements into analytical workbenches, tools, and dashboards that reflect usability best practices and current design trends.
- Demonstrates analytical, interpersonal, and professional communication skills; learns quickly and works effectively both individually and as part of a team.
- Process Improvement – Accesses, extracts, and transforms Credit and Retail data of all sizes from a variety of sources (including client marketing databases and second- and third-party data) using Hadoop, Spark, SQL, and other big data technologies.
- Helps analytical teams automate data-centric workflows using orchestration tools, SQL, and other big data/cloud solutions to improve efficiency.
- Project Support – Supports the Sr. Specialist and Specialist on new analytical proof-of-concept and tool-exploration projects.
- Effectively manages time and resources to deliver concurrent projects correctly and on schedule.
- Creates POCs to ingest and process streaming data using Spark and HDFS (a minimal streaming sketch follows this list).
- Data and Analytics – Answers and troubleshoots questions about data sets and analytical tools; develops, maintains, and enhances new and existing analytics tools to support internal customers.
- Ingests data from files, streams, and databases, then processes it with Python and PySpark to store it in Hive or a NoSQL database (see the batch example after this list).
- Manages data from different sources and assists with HDFS maintenance and the loading of structured and unstructured data.
- Applies Agile Scrum methodology on the client’s big data platform and uses Git for version control. Imports and exports data between HDFS and RDBMS using Sqoop.
- Demonstrates an understanding of Hadoop architecture and the underlying Hadoop framework, including storage management.
- Works on the back end using Scala, Python, and Spark to implement aggregation logic.
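The streaming POC bullet above names Spark and HDFS; a minimal sketch of what such a POC might look like with Spark Structured Streaming follows. Every path, the schema, and the app name are illustrative assumptions, not Bread Financial’s actual pipeline.

```python
# Hypothetical streaming POC: watch an HDFS landing directory for new
# JSON files and write the parsed records back to HDFS as Parquet.
from pyspark.sql import SparkSession
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("streaming-poc").getOrCreate()

# Streaming file sources require an explicit schema up front.
schema = StructType([
    StructField("event_id", StringType()),
    StructField("account_id", StringType()),
    StructField("amount", DoubleType()),
])

events = (
    spark.readStream
    .schema(schema)
    .json("hdfs:///data/landing/events/")   # placeholder landing zone
)

query = (
    events.writeStream
    .format("parquet")
    .option("path", "hdfs:///data/processed/events/")
    .option("checkpointLocation", "hdfs:///data/checkpoints/events/")
    .outputMode("append")
    .start()
)
query.awaitTermination()
```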
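For the batch path (files and relational sources processed with PySpark and stored to Hive), a comparable sketch, assuming Hive support is enabled on the cluster; the JDBC connection details, paths, and table names are hypothetical.

```python
# Hypothetical batch job: read a CSV file and a JDBC table, aggregate
# with PySpark, and persist the result as a Hive table.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("batch-ingest")
    .enableHiveSupport()   # requires a Hive-configured cluster
    .getOrCreate()
)

transactions = spark.read.csv(
    "hdfs:///data/raw/transactions.csv", header=True, inferSchema=True
)

accounts = (
    spark.read.format("jdbc")
    .option("url", "jdbc:mysql://db-host:3306/retail")   # placeholder RDBMS
    .option("dbtable", "accounts")
    .option("user", "etl_user")
    .option("password", "***")
    .load()
)

# Join, aggregate, and store to Hive for downstream analytics.
summary = (
    transactions.join(accounts, "account_id")
    .groupBy("account_id")
    .agg(F.sum("amount").alias("total_spend"))
)
summary.write.mode("overwrite").saveAsTable("analytics.account_spend")
```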
Required Skills and Qualifications:
- Expert at writing complex SQL queries and performing database analysis for good performance (see the SQL sketch below).
- Experience working with Microsoft Azure services such as ADLS/Blob Storage, Azure Data Factory, Azure Functions, and Databricks (an ADLS read sketch follows).
- Basic knowledge of REST APIs for designing networked applications (see the example below).
- Bachelor’s Degree in Computer Science or Engineering.
- 0 to 3 years in Data & Analytics.
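On the SQL requirement, a short sketch of the kind of windowed query the role calls for, run through Spark SQL against the hypothetical Hive table from the batch example above.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.enableHiveSupport().getOrCreate()

# Rank accounts by spend with a window function; the table and column
# names are hypothetical.
top_accounts = spark.sql("""
    SELECT account_id,
           total_spend,
           RANK() OVER (ORDER BY total_spend DESC) AS spend_rank
    FROM analytics.account_spend
""")
top_accounts.filter("spend_rank <= 10").show()
```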
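For the Azure requirement, a hedged sketch of reading Parquet data from ADLS Gen2 with PySpark over the ABFS driver; the storage account, container, key, and path are placeholders, and on Databricks the credential is usually supplied through workspace configuration rather than inline.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("adls-read").getOrCreate()

# Authenticate to the storage account with an account key (placeholder).
spark.conf.set(
    "fs.azure.account.key.mystorageacct.dfs.core.windows.net",
    "<storage-account-key>",
)

# Read a Parquet dataset from a hypothetical ADLS Gen2 container.
df = spark.read.parquet(
    "abfss://raw@mystorageacct.dfs.core.windows.net/credit/2024/"
)
df.show(5)
```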
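And for the REST API requirement, a minimal Python example of calling a networked service; the endpoint and parameters are invented for illustration.

```python
import requests

# Fetch one page of records from a hypothetical internal REST service,
# failing loudly on HTTP errors.
resp = requests.get(
    "https://api.example.com/v1/datasets/credit",
    params={"page": 1, "page_size": 100},
    timeout=10,
)
resp.raise_for_status()
records = resp.json()
print(f"fetched {len(records)} records")
```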