Designs, modifies, develops, and implements solutions for ingesting, provisioning, and securing data in our Lake (S3), Warehouse (Redshift), and Marts (Redshift and MySQL) using Scala, Apache Spark, Hive, Apache Pig, MapReduce, AWS Data Pipeline, AWS Glue, AWS EMR, and Apache Kafka; experience with Informatica and Talend is also expected. Participates in the testing process through test review and analysis, test witnessing, and certification of software. Requires 0-2 years of experience in the field or in a related area. Has knowledge of commonly used concepts, practices, and procedures within the field. Relies on instructions and pre-established guidelines to perform the functions of the job.
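
For illustration only (not part of the role description): a minimal Scala/Spark sketch of the kind of lake-to-warehouse load this role involves, reading raw data from the S3 lake and appending a curated subset into Redshift over JDBC. Bucket names, table names, column names, and connection details are hypothetical placeholders, and the Redshift JDBC driver is assumed to be on the classpath.

```scala
import org.apache.spark.sql.{SaveMode, SparkSession}

object LakeToWarehouse {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("lake-to-warehouse")
      .getOrCreate()

    // Read raw events from the S3 data lake (hypothetical bucket and prefix).
    val events = spark.read.parquet("s3://example-data-lake/events/2024/")

    // Light provisioning step: keep only the columns the warehouse needs.
    val curated = events.select("event_id", "user_id", "event_type", "event_ts")

    // Append the curated data into a Redshift staging table over JDBC
    // (placeholder cluster endpoint and table; Redshift JDBC driver assumed available).
    curated.write
      .format("jdbc")
      .option("url", "jdbc:redshift://example-cluster:5439/dev")
      .option("dbtable", "staging.events")
      .option("user", sys.env.getOrElse("REDSHIFT_USER", ""))
      .option("password", sys.env.getOrElse("REDSHIFT_PASSWORD", ""))
      .mode(SaveMode.Append)
      .save()

    spark.stop()
  }
}
```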