Data Engineer Co-op

Full Time
Boston, MA, USA

Co-op Opportunity:

The Chewy Campus Recruiting Team is seeking a motivated Data Engineer Co-op to join our growing teams in our Boston, MA office.

Internship Timeframe:  June 3rd – December 6th, 2024 (must be available for the full duration)

Qualified Students:  Undergraduate rising seniors or Master’s students

As part of the Data Engineering team at Chewy, you will gain hands-on experience in data engineering, working on projects spanning data pipelines, ETL processes, and data warehouse management. The ideal candidate has a strong interest in building and maintaining cloud databases, ingesting data through a variety of methods (including non-SQL interfaces such as SOAP and REST APIs), and joining datasets from different cloud-based source systems into a centralized database.

What You'll Do:

  • Assist in the development and maintenance of data pipelines to extract, transform, and load (ETL) data from various sources into our data lake and data warehouse.
  • Configure custom data pipelines within Snowflake/AWS/Databricks for ingestion into the Data Mart.
  • Design and implement solutions on a cloud platform using infrastructure as code (Terraform).
  • Maintain, support, and develop within the Supply Chain - Transportation Data Mart Snowflake instance, including code build/review, auditing, performance tuning, and security.
  • Create and maintain technical documentation and models for the Data Mart.

What You'll Need:

  • Currently enrolled full-time in an accredited Bachelor’s or Master’s degree program in Data Engineering, Data Analytics, Machine Learning, Mathematics, Engineering, or a related discipline, with an anticipated graduation date of Spring 2025.
  • Excellent verbal and written communication skills, and the ability to explain complex concepts to non-expert stakeholders in a simple and understandable way.
  • Current permanent U.S. work authorization required.
Bonus:

  • Proficiency in coding and data analysis using Python, PySpark, Airflow, SQL, and Snowflake.
  • Knowledge of the AWS data toolset (Glue, Athena, EMR, EKS, etc.) and other data platforms such as Databricks.
  • Experience translating ambiguous customer requirements into clear problem definitions and delivering on them.
  • Proven experience in the design and execution of analytical projects.

Chewy is committed to equal opportunity. We value and embrace diversity and inclusion of all Team Members. If you have a disability under the Americans with Disabilities Act or similar law, and you need an accommodation during the application process or to perform these job requirements, or if you need a religious accommodation, please contact CAAR@chewy.com.
If you have a question regarding your application, please contact HR@chewy.com.
To access Chewy's Customer Privacy Policy, please click here. To access Chewy's California CPRA Job Applicant Privacy Policy, please click here.