Passionate About Data: You are an expert in SQL and have a passion for the ‘modern data stack’ (i.e., ingestion and transformation tools, cloud-based data warehouses, and BI tools).
You have 5+ years of experience in a hands-on data role, with some experience as a data owner for a department (Python experience a plus)
5+ years of experience designing and implementing ETL solutions using Oracle BI Applications (OBIA) for Oracle ERP
5+ years of experience with Informatica and BI Applications
Ability to work independently on BI projects/tickets and to design and develop BI solutions
Solid understanding of the GL, AP, and AR data models; implementation experience with additional OBIA modules such as P&S Analytics, Spend Classification, and SCM Analytics is a plus
Hands-on experience in data integration and a solid understanding of the GL, AP, and AR data models in EBS is required
Hands-on experience in data modeling and the OBIA star schema
Good understanding of DAC, the Informatica ETL architecture, and the OBIEE prebuilt data models
Collaborative: You partner smoothly with engineering teams to design, implement, and test data infrastructure, and with business users to support data-driven outcomes
Experience transforming data in modeling tools such as dbt or Dataform, and working with a cloud-based data warehouse (BigQuery/GCP experience a plus)
You possess a deep curiosity and tenacity to understand data and business questions
What You’ll Do:
Work with the data engineering team to properly identify and ingest critical data from internal and external sources
Partner with multiple internal business organizations to ensure tracking, pipelines, and modeling meet business needs and technical best practices
Create highly readable and auditable models for use in analytics/reporting, product, and business operations
Contribute to creating a data-centric organization by onboarding and training business users on our data analysis and visualization tools
Build mission-critical analyses and reporting that are central to senior leadership and the majority of the company (new users, active users, platform utilization, etc.)
Be a key partner in building a high-performing data org by sharing knowledge, creating dynamic processes, and creating documentation that helps us scale