Data Scientist with Python – Remote


David Miller

Nov 24, 2025, 1:40:34 PM

Role: Data Scientist with Python

Location: Remote

Duration: Long Term

Visa: Any – OPT is also fine

 

Position Overview

We are seeking a highly skilled, hands-on Data Scientist with strong Python expertise to support ongoing analytics, automation, and modeling initiatives. The ideal candidate enjoys solving complex problems with data, building scalable analytical solutions, and supporting business decision-making with meaningful insights.

This is a part-time contract role to start (20–30 hours/week), with a strong possibility of transitioning into a long-term engagement based on performance and project pipeline.

Key Responsibilities

- Develop robust analytical models, machine learning pipelines, and data-driven insights.
- Write clean, scalable, production-quality Python code.
- Analyze large datasets using statistical techniques, ML frameworks, and predictive analytics.
- Build, enhance, and validate ML models for forecasting, classification, and optimization.
- Automate workflows, reporting, and data-processing scripts.
- Extract, clean, transform, and manipulate data from multiple systems and formats.
- Present findings clearly to technical and non-technical stakeholders.
- Collaborate with engineering, operations, and business teams on ongoing project alignment.

Required Skills & Experience
- 3+ years of proven hands-on experience as a Data Scientist or Machine Learning Engineer.
- Expert-level proficiency in Python, including:
  - Pandas, NumPy
  - scikit-learn
  - Matplotlib / Seaborn / Plotly
  - Jupyter Notebook
- Strong understanding of:
  - Statistical modeling
  - Predictive analytics
  - Machine learning algorithms (classification, regression, clustering, NLP, etc.)
- Experience working with SQL, relational databases, and ETL workflows.
- Solid understanding of version control (Git/GitHub).

Preferred Skills (Nice to Have)
- Experience with cloud platforms such as AWS, GCP, or Azure.
- Familiarity with AI/LLMs, data stitching, and automation frameworks.
- Exposure to tools such as:
  - Databricks
  - Snowflake
  - Apache Airflow
- Experience deploying or operationalizing models in production environments.
