Consulting | Full-Time | Tech

Job Details

Introduction

At IBM, work is more than a job - it's a calling: To build. To design. To code. To consult. To think along with clients and sell. To make markets. To invent. To collaborate. Not just to do something better, but to attempt things you've never thought possible. Are you ready to lead in this new era of technology and solve some of the world's most challenging problems? If so, let's talk.

Your Role and Responsibilities

The Data Engineer Consultant will be responsible for interacting with clients and helping to deliver components of client engagements that identify, design, and implement Data Platform solutions on Hybrid Cloud. The role requires deep experience in Data Architecture, Data Modelling, DataOps, DevSecOps, and Data Governance. Key responsibilities will include:

  • Design, develop, maintain, and enhance data engineering solutions on cloud platforms

  • Migrate mission-critical database applications from on-premises to the cloud, or run hybrid cloud environments, for large enterprises

  • Design highly scalable data engineering solutions on any of the major cloud platforms: Amazon Web Services, Azure, and Google Cloud Platform

  • Leverage data engineering expertise to develop the strategic and tactical foundation for solving data-driven client challenges

  • Work to achieve client goals at the intersection of marketing/customer experience and data/analytics/automation/AI.

  • Perform business and data analysis to investigate business problems and propose solutions, working closely with clients and the team

  • Participate in business development activities such as creating proofs of concept (POCs) and points of view (POVs)

Day-to-day, your role includes:

  • Being on a team responsible for executing end-to-end tech solutions that solve clients' most challenging problems

  • Interfacing with clients' technical and business teams to gather business requirements and define technical requirements for customer experience and marketing data solutions

  • Designing and building data, AI, and analytics solutions using industry-standard technologies

  • Collaborating with teammates, including data architects and data scientists, to develop optimized solutions

  • Leading best practices for unit testing, CI/CD, performance testing, capacity planning, documentation, monitoring, alerting, and incident response

  • Working cross-functionally with other teams to solve deep technical problems

Required Technical and Professional Expertise

  • A four-year college degree in Computer Science, Engineering, or a technical science, or equivalent experience

  • 4+ years of IT development experience

  • Certification in any of the major cloud platforms, e.g., AWS, Google Cloud, IBM Cloud, or Azure

  • Experience with Big Data technologies and solutions (Spark, Hadoop, Hive, MapReduce) and multiple scripting languages (YAML, Python)

  • Experience building and coding applications using Spark, Kafka, Storm, HDFS, HBase, Hive, Sqoop, etc.

  • 3+ years of experience in cloud architecture building real-time data pipelines and scaling data transformations

  • 3+ years of experience working in a CI/CD/DevOps methodology

  • 5-7 years of experience with data processing (ETL) tools and methodologies (Informatica, Talend, Spark, Pentaho, SSIS, Unifi, SnapLogic)

  • Knowledge of Data Architecture and Data Modelling, and experience adopting best practices and policies in the Data Management space

  • Experience in databases, data warehousing and high-performance computing environments.

  • Experience with scripting, automation, SQL/PL-SQL, and high-level languages such as Python and Java

  • Ability to interface with non-technical end users and translate business requests into technical requirements; comfortable communicating with stakeholders to define requirements

  • Experience with business requirements definition and management, structured analysis, process design, and use case documentation

  • Experience preparing and modeling data for use in visualization tools such as Tableau, Power BI, and Qlik

  • Strong collaboration skills

  • The ability to excel in a fast-paced environment

  • Expertise in one or more related capabilities; data visualization, reporting, and dashboarding experience is a plus

Preferred Technical and Professional Expertise

  • Master's degree in Data Science is preferred

  • 2-3 years of experience with cloud platforms such as Azure, AWS, or GCP preferred

  • Machine learning experience is a plus

  • 5+ years of experience with data quality analysis, statistical analysis and/or modeling, etc. preferred

  • 2 years of experience mining and analyzing transactional electronic data, with competencies in one or more of the following: SAS, SQL, SPSS, R, Python, ACP, IDEA, Tableau, MATLAB, Visual Basic, BI, Hadoop, or Alteryx preferred
