B.S. in Computer Science or equivalent work experience
Experience with Amazon Web Services (AWS) technologies: CloudFormation, EC2, S3, EMR, Auto Scaling, CloudWatch
Knowledge of at least one container technology (Docker, Kubernetes)
Experience with CI/CD using JIRA, Jenkins, Ansible
Strong in UNIX, Linux, and databases
Experience with a scripting language such as Bash, Perl, Ruby, or Python
General understanding of the Hadoop framework, including MapReduce, HDFS, Hive, Sqoop, Flume, Pig, Oozie, Storm, Kafka, Falcon
Experience in designing, monitoring, and supporting large-scale, highly available systems
Understanding of protocols and technologies such as SOA, HTTP, SSL, LDAP, JDBC, SQL, HTML, XML
Experience providing production operations and 24/7 support
Successful track record of providing production support for large-scale distributed systems, with experience in automation, production migration, and deployment with minimal downtime
Teamwork and collaboration skills to work across organizations and lead cross-functional teams
Communication and stakeholder management skills
Problem-solving skills to develop quick and sound solutions to complex issues
M.S. in Computer Science or equivalent work experience
Hands-on experience with AWS and interfacing with AWS APIs
Hands-on experience developing microservices and deploying them in Docker/Kubernetes
Experience setting up monitoring using Wily
Strong in UNIX, Linux, shell scripting, Perl, PHP, JavaScript, JSON, Python
Experience with Hadoop administration, including cluster builds, operations, and support
Strong Cloud Foundry and OpenStack fundamentals