Newsgroups: comp.lang.javascript
Subject: Need AWS Data Engineer //C2C//Denver, CO & Dallas, //Onsite
From: ashwinipokalkar05@gmail.com (ashwini pokalkar)
Date: Tue, 15 Feb 2022 08:57:44 -0800 (PST)
Message-ID: <0ed525f1-066c-4009-8b65-3107c2f2799fn@googlegroups.com>

Hello Associates,

Hope you are doing great. We have multiple requirements.
Please go through the requirements below and reply with an updated resume and your expected pay rate.
Position: AWS Data Engineer

Location: Denver, CO & Dallas, TX (Day 1 Onsite)

Description:

7 to 9 years of experience in Data Engineering
Strong experience in Database technologies (Relational, NoSQL and Cloud)
Experience in modelling Data Warehouses, Data Marts and Data Lakes (on Cloud)
Expert in designing and implementing Data Pipelines, ETL/ELT
Experience in AWS Cloud and its services
Experience in Python or Java
Understand data usage and implications for data migration
Must have worked in Data Migration project(s) and understand different migration modules
Should be able to write rule-based transformations, DQ rules and data analysis rules (a short illustrative sketch follows this list)
Good at writing data migration, cleansing, conversion and validation procedures, and performing data migration into production
Should be good with ETL, data integration and analytical tools.
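
Purely as an illustration of what is meant above by rule-based transformations and DQ rules, here is a minimal Python/pandas sketch; the file names and column names are hypothetical and not taken from this requirement:

import pandas as pd

# Hypothetical staged extract; column names are illustrative only.
df = pd.read_csv("customers_extract.csv")

# Rule-based transformations: normalize fields before loading.
df["email"] = df["email"].str.strip().str.lower()
df["signup_date"] = pd.to_datetime(df["signup_date"], errors="coerce")

# Data-quality (DQ) rules: each rule is a boolean mask of failing rows.
dq_rules = {
    "missing_customer_id": df["customer_id"].isna(),
    "invalid_email": ~df["email"].str.contains("@", na=False),
    "future_signup_date": df["signup_date"] > pd.Timestamp.today(),
}

# Report failures, then keep only rows that pass every rule; rejected
# rows would normally be written to a reject/quarantine table instead.
print({rule: int(mask.sum()) for rule, mask in dq_rules.items()})
failed_any = pd.concat(list(dq_rules.values()), axis=1).any(axis=1)
df[~failed_any].to_csv("customers_clean.csv", index=False)

In a real pipeline the same rules would typically run inside an AWS service such as Glue or Lambda rather than as a local script.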

Title- Data Analyst

Location- Denver, CO (First Day Onsite), Dallas, TX

Description:

Around 6 to 7 years of experience delivering technical client service in the Data Analysis space.

Strong mathematical skills to help collect, measure, organize and analyze data.

Technical proficiency regarding database design development, data models, techniques for data mining, and segmentation.

Proficiency in programming/scripting languages including SQL / NoSQL

2 to 3 years of strong experience working with AWS native services will be an added advantage

Proven analysis and problem-solving skills

Experience in building source-to-target mapping sheets in discussion with source SMEs and the business, and working with data modelers and Data Engineers to build and validate the E2E data flow (see the reconciliation sketch after this list)

Experience in providing E2E / UAT testing support

Ability to learn new tools and assimilate knowledge quickly.
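
As a hedged illustration of the source-to-target mapping and E2E validation work referred to above (the mapping, file and column names are made up for the example), a short Python sketch:

import pandas as pd

# Hypothetical source-to-target mapping sheet, reduced to a dict.
mapping = {
    "src_customer_id": "customer_key",
    "src_email": "email_address",
    "src_signup_date": "signup_dt",
}

source = pd.read_csv("source_extract.csv")
target = pd.read_csv("target_load.csv")

# Simple E2E reconciliation checks of the kind run during UAT support:
# row counts agree, every mapped column reached the target, and no
# source keys were dropped on the way.
checks = {
    "row_count_match": len(source) == len(target),
    "mapped_columns_present": all(c in target.columns for c in mapping.values()),
    "no_dropped_keys": set(source["src_customer_id"]).issubset(
        set(target["customer_key"])
    ),
}

for name, passed in checks.items():
    print(f"{name}: {'PASS' if passed else 'FAIL'}")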

Title: DevOps AWS

Work location: Littleton (Denver), CO & Dallas, TX (Onsite)

Job description:

DevOps Engineer with strong experience in AWS administration, Kubernetes Cluster Management, CI/CD pipeline management, Build & Release management, and Infrastructure Automation.

MAJOR DUTIES AND RESPONSIBILITIES

AWS infrastructure administration and management (VPC, EC2, S3, ELB, EBS, Route53, ASM, etc.); a minimal boto3 sketch follows this list.

Kubernetes Cluster Management, including creating new kops clusters and building/deploying Secrets, configs and Docker-based containerized microservices.

Monitoring and alerting on infrastructure resources and application availability using APM and other monitoring tools.

Contribute to infrastructure automation using Infrastructure as Code (Terraform).

CI/CD pipeline management in GitLab.

Troubleshoot and resolve support tickets.
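
To make the AWS administration duty above concrete, a minimal boto3 sketch (the region and the "running" health criterion are assumptions, not part of the job description) that lists EC2 instances and flags any that are not running:

import boto3

# Assumed region; in practice this would come from configuration.
ec2 = boto3.client("ec2", region_name="us-east-1")

# Page through all instances and collect those not in the "running" state.
not_running = []
for page in ec2.get_paginator("describe_instances").paginate():
    for reservation in page["Reservations"]:
        for instance in reservation["Instances"]:
            if instance["State"]["Name"] != "running":
                not_running.append(instance["InstanceId"])

if not_running:
    print("Instances needing attention:", not_running)
else:
    print("All instances are running.")

The same pattern extends to the other listed services (S3, ELB, Route53, etc.) through their respective boto3 clients.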

Required Qualifications

Skills/Abilities and Knowledge

Ability to read, write, speak, and understand English

Knowledge of Voice, Data, Video, Wireless technologies

Strong AWS / Kubernetes knowledge.

Experience/familiarity with building/deploying containerized microservices

Experience in implementing CI/CD pipelines

Experience with networking concepts

Quick learner, able to acquire the knowledge and skills needed for the job in short order.

Willingness to collaborate and work in a team environment.

Education

BA/BS in Information Technology, Computer Science, related field or equivalent work experience.

Related Work Experience

10 plus years’ experience as a DevOps Engineer.

4 plus years’ experience in AWS Administration.

3 plus years’ experience with Kubernetes Cluster Management.

4 plus years’ experience developing CI/CD toolkits and pipelines.

4 plus years’ experience in writing shell scripts (Python/Linux).

Experience with Harness/GitLab is preferable.

Experience in installing/maintaining NoSQL databases like Cassandra and MongoDB, and AWS-managed NoSQL and RDS databases.

Familiarity with service mesh

Thanks & Regards

Ashwini P

Amiga Informatics Inc.

83 Woodbury Rd, Hicksville, NY 11801

Direct: +1 516-758-0982 | Ext: 173

Email: ashw...@amigainformatics.com

USA | Canada | India

