ROLE: Data Engineer
Location: Dallas, TX or Atlanta, GA (either location acceptable), Day 1 onsite
Hybrid: 3 days per week in office (as of now)
Rate (subcon): $65/hr on C2C
Must have: Strong SQL database experience (Oracle, PostgreSQL, or SQL Server); hands-on Python ETL experience; Kafka; PySpark; Azure cloud experience (Event Hubs, ADF, Databricks, Delta Lake); DevOps/CI-CD (GitHub Actions, Azure DevOps); Flink; Adobe Experience Platform (AEP) XDM schema modeling
| Skill Area | Required Skills (Must-Have) | Nice-to-Have Skills |
| --- | --- | --- |
| Data Governance | • Understanding of telecom regulatory rules (PII, CPNI, DPI, GDPR, SOX) • Data cataloging tools (Purview, Collibra, Informatica) • Data classification & metadata management • Data ownership and stewardship frameworks | • DAMA-DMBoK framework knowledge • Adobe XDM governance mapping • Adobe Real-Time CDP governance integration • Experience with AI-driven governance automation • Governance workflows (Jira) • Data retention policy automation |
| Data Modeling and Sourcing | • Strong conceptual, logical, and physical modeling skills • Dimensional modeling (star/snowflake) • Normalization/denormalization strategies • Relational modeling for RDBMS (Oracle, SQL Server, PostgreSQL) • SQL proficiency for sourcing, profiling, and validation • Data quality profiling on ingested feeds • Schema evolution automation for streaming data | • Knowledge of TM Forum SID / eTOM models • Data Vault modeling • Understanding of telecom data flows & domains (orders, billing, charging, usage, SIM, provisioning, network events) • Adobe Experience Platform (AEP) XDM schema modeling |
| Cloud & Storage Platforms | • Cloud data storage: ADLS • Access control management (RBAC, ACLs, Key Vault, IAM) • Understanding of storage zone patterns (Raw, Curated, Processed) | • Lakehouse technologies (Delta Lake, Iceberg, Hudi) • Multi-cloud governance • Data ingestion tools (ADF, NiFi) |
| Data Engineering (General) | • ETL/ELT concepts (mapping, transformations, business rules) • Intermediate Python/SQL • Batch and streaming processing basics • Familiarity with version control (Git) | • PySpark/Scala • dbt Core modeling and testing • Experience with Databricks jobs and pipelines • DevOps/CI-CD (GitHub Actions, Azure DevOps) |
| Streaming & Real-Time Data | • Basics of Kafka or Event Hubs • Understanding of real-time vs. batch architectural differences | • Flink/Spark Streaming |
| Security, Privacy & Compliance | • Data classification & policy enforcement • PII and sensitive telecom data handling • Encryption, masking, and data access governance | |
| Soft Skills & Governance Playbook Execution | • Stakeholder alignment (Business, Marketing, IT) • Ability to translate business rules into data rules • Data stewardship & operational governance | • Change management • Workflow automation for governance processes |
Thanks & Regards,
Irfan Shaik
P : 972-440-0069
Agile Enterprise Solutions Inc.
2591 Dallas Parkway, Suite 300, Frisco, TX 75034
Email: irfan...@aesincus.com Website: www.aesinc.us.com