Senior Data Engineer - 6 month contract
€500 per day - Dublin (remote)

Key Responsibilities
· Maintain and develop data pipelines for the extraction, transformation, cleaning, pre-processing, aggregation, and loading of data from a wide variety of data sources using Python, SQL, DBT, and other data technologies
· Design, implement, test, and maintain data pipelines and new features based on stakeholders' requirements
· Develop and maintain scalable, available, quality-assured analytical building blocks/datasets in close coordination with data analysts
· Optimize and maintain workflows and scripts on existing data warehouses and ETL processes
· Design, develop, and maintain components of data processing frameworks
· Build and maintain data quality and durability tracking mechanisms to provide visibility into, and address, inevitable changes in data ingestion, processing, and storage
· Collaborate with stakeholders to define data requirements and objectives
· Translate technical designs into business-appropriate representations; analyse business needs and requirements, ensuring the implementation of data services directly supports the strategy and growth of the business
· Address questions from downstream data consumers through appropriate channels
· Create data tools for the analytics and BI teams that help them build and optimize our product into an innovative industry leader
· Stay up to date with data engineering best practices and patterns; evaluate and analyse new technologies, capabilities, and open-source software in the context of our data strategy, ensuring we adapt our core technologies to stay ahead of the industry
· Contribute to the analytics engineering process

Required Qualifications
· 5+ years of relevant work experience
· BA/BS in Data Science, Computer Science, Statistics, Mathematics, or a related field
· Has built processes supporting data transformation, data structures, metadata, dependency, data quality, and workload management
· Experience with Snowflake, including hands-on experience with Snowflake utilities, SnowSQL, and Snowpipe; must have worked on Snowflake cost-optimization scenarios
· Solid overall programming skills, able to write modular, maintainable code, preferably in Python and SQL
· Experience with workflow management solutions such as Airflow
· Experience with data transformation tools such as DBT
· Experience working with Git
· Experience working with big data environments such as Hive, Spark, and Presto
· Ready to work flexible hours

Preferred Requirements
· Experience supporting Support and Customer Success teams
· Experience with Airflow DAGs
· Knowledge of natural language processing (NLP) and computer vision techniques
· Familiarity with version control systems (e.g., Git)
· Snowflake
· DBT
· Working knowledge of Power BI
· AWS environment, for example S3, Lambda, Glue, CloudWatch
· Basic understanding of Salesforce
· Experience working with remote teams spread across multiple time zones
· A hunger to learn and the ability to operate in a self-guided manner

For more info please contact Michael on 01 6146058
#LI-MF7