Big Data

Data Engineering – Hadoop/Big Data


In our Software Engineering Group, we look first and foremost for people who are passionate about solving
business problems through innovation and engineering practices. You will apply your depth of knowledge and
expertise across all aspects of the software development lifecycle, and partner continuously with your many
stakeholders to stay focused on common goals. We embrace a culture of experimentation and constantly
strive for improvement and learning. You'll work in a collaborative, trusting, thought-provoking environment, one that
encourages diversity of thought and creative solutions in the best interests of our customers globally.
Commercial Banking IT is looking for a Big Data Software Engineer with skills and experience in large-scale Hadoop-based
data platforms, who will be responsible for the design, development, and testing of a next-generation enterprise data hub
and its reporting and analytics applications. This individual will work with an existing development team to build the new
Hadoop-based platform, migrate the existing data platforms, and provide production support. The current platform
uses many tools, including Oracle SQL, SQL Server, SSIS, and SSRS/SSAS. The candidate will be accountable for
design, development, implementation, and post-implementation maintenance and support, and will develop and
test new interfaces, enhancements and changes to existing interfaces, new data structures, and new reporting capabilities.
Responsibilities:
– Acquire data from primary or secondary data sources
– Identify, analyze, and interpret trends or patterns in complex data sets
– Transform existing ETL logic into the Hadoop platform
– Innovate new ways of managing, transforming and validating data
– Establish and enforce guidelines to ensure consistency, quality and completeness of data assets
– Apply quality assurance best practices to all work products
– Analyze, design, and code business-related solutions, as well as core architectural changes, using an Agile programming
approach, delivering software on time and on budget
– Experience working in development teams using agile techniques, object-oriented development, and scripting
languages is preferred
– Comfortable learning cutting-edge technologies and applying them to greenfield projects

Qualifications
This role requires a wide variety of strengths and capabilities, including:
– BS/BA degree or equivalent experience
– Advanced knowledge of application, data and infrastructure architecture disciplines
– 5+ years of experience with Big Data technologies (Hadoop, YARN, Sqoop, Spark SQL, NiFi, Hive, Impala, Kafka, etc.)
– 5+ years of experience in Java development
– 3+ years of experience with Python is preferred
– Experience performing data analytics on Hadoop-based platforms is preferred
– Experience writing SQL queries
– Experience implementing complex ETL transformations on the Hadoop platform
– Strong experience with UNIX shell scripting to automate file preparation and database loads
– Experience with ETL tools
– Experience in data quality testing; adept at writing test cases and scripts, presenting and resolving data issues
– Familiarity with relational database environments (Oracle, SQL Server, etc.), including tables/views, stored
procedures, agent jobs, etc.
– Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information
with attention to detail and accuracy
– Strong development discipline and adherence to best practices and standards
– Ability to manage multiple priorities and projects coupled with the flexibility to quickly adapt to ever-evolving business
needs
– Demonstrated independent problem-solving skills and the ability to develop solutions to complex analytical and data-driven
problems
– Must be able to communicate complex issues in a crisp and concise fashion to multiple levels of management
– Excellent interpersonal skills necessary to work effectively with colleagues at various levels of the organization and
across multiple locations
– Familiarity with NoSQL database platforms is a plus
– Proficiency across the full range of database and business intelligence tools; publishing and presenting information in an
engaging way is a plus
– Experience with data management processes in AWS is a strong plus
– Experience with multiple reporting tools (QlikView/QlikSense, Tableau, SSRS, SSAS, Cognos) is a plus
– Experience in Data Science, Machine Learning and AI is a plus
– Financial Services and Commercial banking experience is a plus


