Looking for Mid-Level DevOps Engineer and Senior Data Engineer

Rahul Teja

Aug 25, 2020, 12:42:11 PM
to Resumes
Hello Partners,
Please review the following two requirements and share candidate resumes to ra...@lor-venk.com, or call 919-689-5606 or 804-554-1121.

1) Role: DevOps Engineer
Location: Richmond, VA
Client: Capital One
Duration: Long term
Any visa is accepted
Minimum Experience: 5+ years
Rate: $45/hr on C2C

Basic Qualifications:
  • Bachelor’s Degree or military experience
  • At least 5 years of Software Development Life Cycle (SDLC) experience
  • At least 3 years of experience in Build and CI/CD/CT technologies
  • At least 3 years of experience in Linux Shell Scripting

Preferred Qualifications
  • 2+ years of experience with Groovy, Python or Ruby
  • 2+ years of experience in developing RESTful APIs using Spring and Jersey
  • 1+ years experience with Kafka, Spark, or Grafana
  • Certification in Java, Spring, or AWS

2) Role: Sr. Data Engineer / Big Data / Hadoop Developer
Location: Richmond, VA or McLean, VA
Client: Capital One
Duration: Long term
Any visa is accepted
Minimum Experience: 8+ years (please share resumes with a minimum of 8 years' experience)
Rate: Negotiable as per experience

Responsibilities of the role: 
  • Build data pipeline frameworks to automate high-volume and real-time data delivery to our cloud platform
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies
  • Develop and enhance applications using a modern technology stack such as Java, Python, shell scripting, Scala, Postgres, AngularJS, React, and cloud-based data warehousing services such as Snowflake
  • Perform unit tests and conduct reviews with other team members to make sure your code is rigorously designed, elegantly coded, and effectively tuned for performance
Required Experience:
  • 5+ years of experience building data pipelines and using ETL tools to solve complex business problems in an Agile environment
  • 5+ years of experience in at least one scripting language (SQL, Python, Perl, JavaScript, Shell)
  • 3+ years of experience using relational database systems (Snowflake, PostgreSQL, or MySQL)
  • 3+ years of experience working on streaming data applications (Spark Streaming, Kafka, Kinesis, or Flink)
  • 3+ years of experience in big data technologies (MapReduce, Cassandra, Accumulo, HBase, Spark, Hadoop, HDFS, AVRO, MongoDB, or Zookeeper)
  • 2+ years of experience with Amazon Web Services (AWS)

Best Regards,
