Cloud Data Engineer
Boston Energy Trading & Marketing (BETM) has accelerated its shift to digital growth and cloud enablement. Our talented, energetic team is creating next-gen platforms that provide industry-leading solutions supporting the green energy transition. We're seeking candidates with the passion to enhance value through technology and the experience to effectively manage & mature the solutions we create. If you have those traits and are ready to join our Boston-based team in a hybrid work model, we would love to hear from you!
As part of this team, you will engage closely with business & IT colleagues to improve, streamline and automate business processes. You will design, build and manage applications and workflows in a cloud environment. You will leverage tools that automate processes, enabling our DevOps capabilities to manage all aspects of application development. You are organized, driven to solve problems and have a passion for lifelong learning. Your strong engineering skills, along with your customer-focused mindset, make you a valuable addition to our team.
Essential Duties & Responsibilities:
• Design, develop and maintain APIs, data pipelines & automation tools using Python, FastAPI and Azure Services.
• Write clean, efficient, and maintainable code that follows best practices. Apply the full software development lifecycle when developing proper Python packages.
• Optimize application performance and scalability to meet business needs.
• Identify opportunities for reusable code, creating microservices and reusable components that increase the team’s ability to deliver quality solutions quickly.
• Work directly with analysts to gain insight into existing data sets. Evaluate use cases for new data, turning requests into actionable designs.
• Incorporate new data sources from external vendors using streams, flat files, APIs, and databases.
• Maintain and provide support for existing data pipelines using Python, ADF, and SQL.
• Plan & execute using agile methodologies, developing & delivering within predefined sprints.
• Identify & deploy appropriate file formats for data ingestion into various storage and/or compute services via ADF for multiple use cases.
• Develop and implement tests to ensure data quality across all integrated data sources.
Required Experience and Skills:
• Bachelor’s or higher degree in Computer Science, Engineering or related field of study
• 4+ years of professional programming experience
• Strong experience in Python development, testing frameworks, workflow automation and cloud-based technologies
• Solid understanding of OOP, algorithms, data structures, and design patterns
• Familiarity with Snowflake or any other OLTP/OLAP database
• Familiarity with Azure services including Logic Apps, Azure Functions, Azure Storage, SQL Database, SQL Managed Instance.
• Familiarity with DevOps tools and CI/CD pipelines, dependency management, database systems, and cloud solutions
• Passion for technology and excellence in cloud platforms, implementation, and troubleshooting
• Excellent communication and collaboration skills, and a willingness to learn new skills
• Supportive of a culture of continuous process and organizational improvement
Other details
- Job Family BETM
- Pay Type Salary
- Location Boston, MA, USA