
Overview of salary statistics for the profession "Big Data Cloud Support Engineer" in Canada

Unfortunately, there are no statistics for this request. Try changing your position or region.

Recommended vacancies

Engineer, Data
Aecon Group Inc., Toronto, ON
Come Build Your Career at Aecon! As a Canadian leader in infrastructure development, Aecon is safely and sustainably building what matters for future generations to thrive. We lead some of the most impactful infrastructure projects of our generation, at the forefront of transformational change in transportation and energy, partnering every day to build, connect, power, and strengthen our communities.

At Aecon, you can count on:
- Safety First. Our number one core value. If we can't do it safely, we don't do it at all.
- Integrity. We lead by example, with humility and courage.
- Accountability. We're passionate about delivering on our commitments.
- Inclusion. We provide equitable opportunities for everyone.

We lead the infrastructure industry with purpose, and our people are at the heart of everything we do. So, we invest in our people, just like they invest in us! At Aecon we:
- Ensure you and your family receive the services needed to support your mental, emotional, and physical well-being.
- Believe in helping you build your career through our Aecon University and Leadership Programs.
- Are committed to supporting and investing in inclusive work environments through initiatives like Equity, Diversity & Inclusion training, our Aecon Women in Trades and Aecon Diversity in Trades programs, and our Employee Resource Groups (ERGs), to ensure we are building inclusion into every aspect of our culture at Aecon.
- Are a leader in sustainable construction, with a strong commitment to operating responsibly by minimizing our impact on the environment and surrounding communities.

Our business success relies on strong execution and continuous improvement driven by the diversity, expertise, and teamwork of our people. We're always searching the globe for innovative, collaborative minds to join our best-in-class Aecon community!

Position Overview
Aecon is well positioned in the Canadian marketplace as an industry leader in the development and construction of infrastructure.
We have a roster of ongoing major projects here and abroad, a record backlog diversified across multiple sectors and durations, and a robust pipeline of future project pursuits. We are in a strong market position, but we are ultimately aiming higher. The Junior Data Engineer will join our Corporate Information Services team, reporting to the Director, Enterprise Data Architecture. The incumbent will be responsible for identifying and meeting data requirements and for expanding and optimizing our data and data pipeline architecture. The incumbent will also support our Business Analytics team, business users, and Data Architecture team on data initiatives, and will ensure an optimal data delivery architecture is consistent across new and ongoing projects.

Key Responsibilities
- Assemble large and complex data sets, including but not limited to IoT data and SAP and non-SAP data sets across various business domains.
- Create optimal data pipelines using Azure big data technologies.
- Develop and implement AI models that can analyze large volumes of data and generate insights.
- Collaborate with data scientists and software engineers to integrate AI models into production systems.
- Conduct performance testing and optimization of AI models and data pipelines.
- Develop and maintain documentation for all data engineering and AI-related processes.
- Stay up to date with the latest trends in data engineering and AI technologies.
- Work with various stakeholders and IT to assist with data-related technical issues and support their requirements.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required to support optimal extraction, transformation, loading, and modelling of data from a wide variety of data sources.
- Ensure data is secure, separated, and governed across the data architecture platform.
- Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
- Provide end user support as required.

Required Qualifications, Knowledge and Experience
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
- At least 1-3 years of experience in data engineering and AI model development.
- Well versed in large language models and foundation models, and in the different architectures used to implement solutions with them.
- Strong experience with the Azure data platforms: Azure Data Lake, Data Factory, Azure Synapse, Azure Storage, Stream Analytics, Event Hub, Azure SQL.
- 1-4 years of experience in a data engineering or similar role.
- Minimum 2 years' experience building big data pipelines, architecture, and data sets.
- Minimum 2 years of experience working with stream data ingestion and processing.
- Experience creating and managing a data warehouse, designing related extraction and loading functions, testing designs, and data modeling.
- Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.
- Data discovery, visualization, and reporting experience using SAP BusinessObjects, SAP Analytics Cloud, and Power BI.
- Experience with object-oriented/object-function scripting languages: Python, Java, C++, Scala, etc.
- CI/CD and Azure DevOps experience highly desirable.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- Strong analytic skills related to working with unstructured datasets.
- Strong project management and organizational skills.
- Experience supporting and working with cross-functional teams in a dynamic environment.
- Strong ability to lead and drive results.
- An ability to manage multiple assignments with minimal supervision.
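The extract-transform-load responsibilities above can be sketched in a few lines. This is a minimal, illustrative pipeline using only the Python standard library; the field names and table are hypothetical stand-ins, and a production Azure pipeline (Data Factory, Synapse) would replace each stage with managed services.

```python
import sqlite3

# Hypothetical raw sensor readings (a stand-in for IoT / SAP extracts).
raw_rows = [
    {"device": "pump-1", "reading": "12.5"},
    {"device": "pump-2", "reading": "bad"},   # malformed value, should be dropped
    {"device": "pump-1", "reading": "13.1"},
]

def extract(rows):
    # In a real pipeline this would pull from source systems.
    return list(rows)

def transform(rows):
    # Validate and type-convert; drop rows that fail validation.
    clean = []
    for r in rows:
        try:
            clean.append((r["device"], float(r["reading"])))
        except ValueError:
            continue
    return clean

def load(rows, conn):
    conn.execute("CREATE TABLE IF NOT EXISTS readings (device TEXT, value REAL)")
    conn.executemany("INSERT INTO readings VALUES (?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(raw_rows)), conn)
count = conn.execute("SELECT COUNT(*) FROM readings").fetchone()[0]
print(count)  # 2 rows survive validation
```

The same extract/validate/load shape scales up when the stages become Spark jobs or Data Factory activities.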
Preferred Qualifications
- Data Engineering certification.
- SAP BI, SAP HANA. Data modelling using SAP HANA Studio or Web IDE will be given preference.

Aecon fosters diversity, inclusion, and belonging within and across our organization. We welcome all to apply, including women, visible minorities, Indigenous peoples, persons with disabilities, and persons of any sexual orientation or gender identity. We are committed to adhering to the objectives and requirements outlined in the Accessible Canada Act (ACA) and to meeting the accessibility needs of persons with disabilities in a timely manner, through the implementation of the requirements of the ACA and its applicable regulations. If you require accommodation under the ACA during any step of the application process, please click here.
Senior Consultant - Data Engineer
KPMG, Vancouver, BC
Overview
At KPMG, you'll join a team of diverse and dedicated problem solvers, connected by a common cause: turning insight into opportunity for clients and communities around the world. Our Vancouver team is looking for a highly motivated Data Engineer at the Senior Consultant level to join our team! As a member of KPMG Canada's Lighthouse team, you will be dedicated to enabling and optimizing our clients' functionality across our data, analytics, and software solutions. This role will be a rewarding experience for you if you:
- Thrive in a growth-focused environment that provides exposure to rewarding projects delivering impact to clients across different verticals.
- Work well in a project team environment and have excellent collaboration and interpersonal skills.
- Have a "figure it out" mindset.

Innovate. Collaborate. Shine. Lighthouse, KPMG Canada's Center of Excellence for data valorization and advanced analytics, applies data science to solve real-world business problems, operationalize AI, and optimize emerging technologies for its mission. Join a diverse team that is always curious and learning, thinks independently, works collaboratively, has a passion for solving difficult problems, and has fun doing it.

What you will do
As a Senior Consultant, you'll lead and work as part of a team of problem solvers with extensive consulting and industry experience, helping our clients solve complex business issues from strategy to implementation. Specific responsibilities include but are not limited to the following:
- Work with various engagement teams to understand the business and gather technical data strategy and solutions for client engagements.
- Participate in the architecture, development, deployment, and maintenance of secure, extensible, scalable, repeatable, and high-performing data platforms and optimal data pipelines that source data from a variety of sources across on-prem and cloud environments (MS Azure and AWS) to support streaming and batch data flows.
- Assemble large, complex data sets that meet functional and non-functional business requirements.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Work with stakeholders, including the Executive, Product, Data, and Design teams, to assist with data-related technical issues and support their data infrastructure needs.
- Perform data management tasks, including data architecture design and data modeling; master data, metadata, data security, privacy, and data quality management; data operations; and data integration and interoperability.
- Develop standards for data processes and automate routine tasks.
- Run queries for descriptive analytics and provide formatted result sets.
- Proactively contribute to the creation of presentation materials relating to data activities for stakeholder discussions.
- Create data tools for analytics and data scientist team members that assist them in developing and optimizing our product into a creative industry leader.
- Take ownership of projects and push technological boundaries.
- Establish and maintain effective working relationships with colleagues, existing clients, and prospective client organizations.
- Engage in and contribute to the innovation, growth, and enhancement of KPMG Lighthouse services.

What you bring to the role
- 3+ years of experience in data engineering (supporting data/data platform architecture, design, implementation, and/or support of complex data and application architectures that are scalable and repeatable), ideally within a services/consulting organization.
- Experience with relational and non-relational databases, including query language (SQL).
- Experience developing and optimizing data pipelines, architectures, and data models for all data types, speeds, and sizes.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- Experience supporting the development of standards and processes covering data, data transformation, data structures, metadata, dependencies, and workload management.
- Experience manipulating, processing, and extracting value from large, discrete datasets.
- Working knowledge of message queuing, stream processing, parallel processing, and data lakes.
- Experience deploying and maintaining high-performance computing VMs in on-premises and cloud environments.
- Experience using a combination of industry-relevant software/tools, including some of the following:
  - Relational SQL and NoSQL databases: MS SQL Server, PostgreSQL, and Cosmos DB.
  - Data warehousing solutions (Azure Synapse preferred).
  - Data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
  - Cloud data services (Azure preferred).
  - Object-oriented/object-function scripting languages: Python (preferred), Java, C++, Scala.
  - Big data tools: Hadoop, Spark, and Kafka are desired.
  - Stream-processing systems: Spark Streaming, Databricks, and Azure Event Hubs are desired.

Keys to your success
You recognize the big picture both as a consultant and as an internal contributor. You are motivated to create practical, cost-effective solutions that are relevant to your clients' goals and challenges. You have developed a reputation as a knowledgeable professional in your area, possess a solution-oriented and critical-thinking mindset, and enjoy guiding others through complex and at times ambiguous challenges. You're a self-starter who takes initiative to constantly improve your skill set and contribute to the overall performance and success of the practice. You are an exceptional communicator, both verbally and in writing, with the ability to deliver professional communications, presentations, reports, and documentation.
You have developed a deep professional brand and presence in the community and enjoy contributing to and participating in events.

KPMG BC Region Pay Range Information
The expected base salary range for this position is $71,500 to $119,000, and the role may be eligible for bonus awards. The determination of an applicant's base salary within this range is based on the individual's location, skills, competencies, and unique qualifications. In addition, KPMG offers a comprehensive and competitive Total Rewards program. #LI-Hybrid

Providing you with the support you need to be at your best
For more information about KPMG in Canada's benefits and well-being, click here.

Our Values, The KPMG Way
Integrity, we do what is right | Excellence, we never stop learning and improving | Courage, we think and act boldly | Together, we respect each other and draw strength from our differences | For Better, we do what matters

KPMG in Canada is a proud equal opportunities employer and we are committed to creating a respectful, inclusive, and barrier-free workplace that allows all of our people to reach their full potential. A diverse workforce is key to our success and we believe in bringing your whole self to work. We welcome all qualified candidates to apply and hope you will choose KPMG in Canada as your employer of choice. For more information about Inclusion, Diversity & Equity in Recruitment, please click here. If you have a question about accessible employment at KPMG, or to begin a confidential conversation about your individual accessibility or accommodation needs through the recruitment process, we encourage you to visit our accessibility page.
112559 - Senior Data Engineer
Vancouver Coastal Health, Vancouver, BC
Senior Data Engineer
Job ID: 2023-112559
City: Vancouver
Work Location: Corporate Admin-520 W 6th
Home Worksite: 00 - Excluded - VCHA
Labour Agreement: Excluded
Union: 905 - Mgt/Excluded-VCHA (37.5 Hr)
Position Type: Baseline
Job Status: Temporary Full-Time
FTE: 1.00
Standard Hours / Week: 37.50
Job Category: Information Technology
Salary Grade: 08U
Min Hourly: CAD $44.15/Hr.
Max Hourly: CAD $63.47/Hr.
Shift Times: 0800 to 1600
Days Off: Saturday, Stats, Sunday
End Date: 12/8/2024
Salary: The salary range for this position is CAD $44.15/Hr. - CAD $63.47/Hr.

Job Summary
Come work as a Senior Data Engineer with Vancouver Coastal Health (VCH)! Vancouver Coastal Health is looking for a Senior Data Engineer to join the Data & Solutions Infrastructure Team. We are specifically looking for someone who has experience with Databricks. Apply today to join our team!

As a Senior Data Engineer with Vancouver Coastal Health you will:
- Be part of the Data & Solutions Infrastructure technical team, with a primary focus on the ongoing sustainment and development of the organisation's cloud and on-premises data infrastructure.
- Design and develop scalable, efficient data integration solutions for large-scale data analysis, model development, validation, and implementation.
- Collaborate with a variety of cross-functional team members to identify, develop, implement, and maintain innovative solutions for the information needs of Vancouver Coastal Health.
- Identify, develop, implement, and maintain innovative responses to the information needs of the organization, and ensure the department's technical infrastructure is developed and sustained.
- Apply experience in data modelling, data warehousing, and building data pipelines, plus excellent problem-solving ability, to tackle processing of high volumes of clinical data.
Qualifications

Education & Experience
- Master's Degree in Business Administration, Information Systems, or equivalent.
- Five (5) years' recent, related experience, including technical experience working with databases and decision support tools, preferably in a health care setting, or equivalent industry/certification experience (e.g. Databricks Data Engineer Associate, Databricks Data Engineer Professional, Microsoft Azure Data Engineer Associate).
- Five (5) years' experience in Information Technology with a focus on database development, warehousing, and/or data engineering.

Competencies: Refer to the VCH enabling competencies for Professional Practitioner positions.

Knowledge & Abilities
- Demonstrated proficiency using the Databricks platform and understanding of the concepts of Databricks SQL, Delta Lake, and cluster management.
- Strong proficiency in Databricks SQL, Python, and Scala, with an ability to write efficient and scalable code for data processing/transformation tasks.
- Demonstrated experience building and optimizing data pipelines (ETL/ELT).
- Strong understanding of DevOps and MLOps practices, focusing on version control (Git), CI/CD pipelines, and automated testing frameworks.
- Demonstrated experience using big data processing frameworks such as Apache Spark and Hadoop.
- Strong understanding of Spark RDDs, data frames, and APIs for data manipulation.
- Demonstrated knowledge of data modelling and data warehousing principles.
- Knowledge of cloud data warehousing solutions (e.g. Azure SQL, Synapse Analytics, Snowflake, etc.) an asset.
- Knowledge of Azure or other cloud infrastructure services (e.g. Azure Data Factory, Azure DevOps) an asset.
- Experience working in an Agile environment.
- Advanced analytical and problem-solving skills.
- Excellent written and verbal communication skills.
- Knowledge of database design and report design at the intermediate to advanced level.
- Knowledge of healthcare databases an asset.
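The qualifications above call out Spark data frames and group-style aggregations over clinical data. As a hedged illustration, the plain-Python sketch below shows the shape of that operation; the record fields are invented, and at VCH scale the same logic would be a Spark `groupBy("site").sum("patients")` over a data frame rather than a dictionary.

```python
from collections import defaultdict

# Hypothetical clinical event records (illustrative only).
events = [
    {"site": "VGH", "patients": 12},
    {"site": "UBC", "patients": 7},
    {"site": "VGH", "patients": 5},
]

# Group-and-sum: the same operation a Spark data-frame
# aggregation expresses, here over an in-memory list.
totals = defaultdict(int)
for e in events:
    totals[e["site"]] += e["patients"]

print(dict(totals))  # {'VGH': 17, 'UBC': 7}
```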
Closing Statement
The hours of work, including days off and work area, may be subject to change consistent with operational requirements and the provisions of the Collective Agreement and applicable statutes. Successful applicants may be required to complete a Criminal Records Review Check.

As per the current Public Health Orders, as of October 5, 2023, all employees working for Vancouver Coastal Health must be fully vaccinated for COVID-19 or have received a single dose of the most recent, updated COVID-19 vaccine. Proof of vaccination status will be required.

WHY JOIN VANCOUVER COASTAL HEALTH?
VCH is a world-class innovator in medical care, research, and teaching, delivering service to more than one million BC residents. At VCH, we embrace thinking boldly, taking smart risks, and 'going first' when we believe it will lead to the best possible outcomes for patients and their families. We invite you to join us in creating healthy lives in healthy communities by showcasing our passion for care, connection to the communities we serve, and our culture of teamwork that makes VCH a great place to work.
• Comprehensive health benefits package, including MSP, extended health and dental, and municipal pension plan
• Grow your career with employer-paid training and leadership development opportunities
• Wellness supports, including counselling, critical incident, and innovative wellness services, are available to employees and their immediate families
• Award-winning recognition programs to honour staff, medical staff, and volunteers
• Access to exclusive discount offers and deals for VCH staff

Equity, diversity, and inclusion are essential to our goals of creating a great place to work and delivering exceptional care.
We acknowledge and accommodate unique differences and ensure special measures are in place so that all prospective and current employees are given an opportunity to succeed. We are committed to building a representative workforce and encourage applications reflecting diversity of sex, sexual orientation, gender identity or expression, racialization or ancestry, disability, political belief, religion, marital or family status, age, and/or status as a First Nation, Metis, Inuit, or Indigenous person. Vancouver Coastal Health is proud to be recognized as one of Canada's Top 100 Employers in 2024. Only short-listed applicants will be contacted for this posting.

***Employees of VCH must apply online via the Internal Career Portal on CareerHub; you are currently viewing the External Career Portal. Refer to the https://my.vch.ca/working-here/job-postings site for instructions on how to view internal job postings and how to apply as an employee. Current VCH employees who apply to this posting using this external site will be considered as external candidates. Seniority will not apply.***

Thank you for your interest in Vancouver Coastal Health.
Int Data Engineer to design, build and maintain data infrastructure
S.i. Systems, Montreal, QC
Our valued client is looking for an Intermediate Data Engineer to design, build, and maintain data infrastructure. Initial 3-6 month hybrid contract in Ottawa (minimum 2 days a week onsite), with the possibility of conversion to full-time permanent.

Responsibilities:
- Design, build, and maintain data infrastructure that supports the efficient extraction, transformation, and loading (ETL) of data from various sources.
- Develop ways to increase the usability of data through changes in business processes and/or data cleansing techniques.
- Design, build, and maintain data pipelines and ETL processes using tools such as StreamSets, Apache Spark, Hadoop, and other big data technologies.
- Develop and maintain data infrastructure, including data warehouses and other data storage solutions, to support business intelligence and reporting needs.
- Design and implement data security measures to protect sensitive data.
- Develop and maintain data quality control processes to ensure the accuracy, reliability, and accessibility of data to all stakeholders.
- Monitor system performance of scheduled ETL batches and streaming data processes, and ensure all systems are working at an acceptable standard and that data pipelines are scalable, repeatable, and secure.
- Perform data migrations between development, UAT, and production systems, and plan and coordinate these migrations.
- Analyze and troubleshoot technical issues quickly to resolution, working with internal ITS sections and software vendors when required.

Must-Have Skills:
- 5+ years' experience in a data analytics environment with progressively more technical responsibilities in an engineering role.
- Designing and creating ETL processes, data pipelines, and workflows.
- Logical and physical data models using data modelling best practices.
- Developing scripts, applications, and APIs to automate data processing tasks using programming languages such as SQL, Python, Java, Scala, shell scripting, and JavaScript.
- Designing, building, and supporting data warehouses, data hubs, and other data storage and processing systems.

Nice-to-Have Skills:
- Experience with cloud computing platforms such as Azure, including setting up and managing cloud-based data storage and computing environments.
- Working with stream processing frameworks such as Apache Kafka or StreamSets.
- Designing and implementing real-time data processing pipelines using these tools.
- Designing and implementing database solutions using technologies such as MySQL, PostgreSQL, or SQL Server.
- Project implementation analysis and support in data management systems, data integrity, and security as they relate to environmental business systems.
- Machine learning concepts and tools (R, Python, Jupyter Notebook).
- Utilization of tools such as Apache Spark, Hadoop, and other big data technologies.

Apply
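The "real-time data processing pipelines" item in this posting typically means windowed aggregations over a stream. Below is a minimal sliding-window average in standard-library Python; the window size and values are arbitrary, and a framework such as Kafka Streams or Spark Structured Streaming would compute the same thing over an unbounded, distributed stream.

```python
from collections import deque

# Fixed-size sliding window over a stream of sensor values -
# a toy version of the windowed aggregations a stream processor computes.
WINDOW = 3
window = deque(maxlen=WINDOW)  # old values fall off automatically
averages = []

for value in [10, 20, 30, 40, 50]:
    window.append(value)
    if len(window) == WINDOW:  # emit only once the window is full
        averages.append(sum(window) / WINDOW)

print(averages)  # [20.0, 30.0, 40.0]
```

`deque(maxlen=...)` is the key trick: appending to a full deque evicts the oldest element, so each emitted average covers exactly the last three values.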
Data Engineer
CGI Group, Markham, ON
Position Description:
We are Canada's largest independent information technology services firm, and after 40 years, we're still growing! Innovation, technology, and service delivery are our focus. Our goal is to ensure our clients remain ahead of the competition. We provide a full spectrum of managed services, from IT and business process outsourcing to systems integration and consulting, that are transforming our clients' operations and helping them to succeed.

We have an excellent opportunity for a Data Engineer to be part of a team that advocates the use of advanced data technologies to solve business problems and deliver modern solutions, while building close relationships with both client and CGI team members.

Your future duties and responsibilities:
• Build, own, and run data engineering pipelines, workflows, and standardized practices for delivering new products and capabilities using data and cloud technologies, including data transformation, processing, and analysis.
• Architect, develop, and maintain real-time data streaming solutions using Apache Kafka.
• Design and implement data flow management systems to ensure smooth and reliable data ingestion and processing.
• Develop and maintain a schema registry to ensure data schema compatibility and evolution.
• Optimize Kafka cluster performance.
• Integrate Kafka streams with BigQuery and other data storage systems for analytics and reporting purposes.
• Implement data security measures to protect sensitive data during transit and storage.
• Work closely with the Data Engineering team to develop efficient ETL processes and ensure data quality.
• Troubleshoot production issues.
• Perform optimization and tuning of data workflow processes and applications.
• Love solving complex problems, stay up to date on the latest technology trends, and have a strong desire to learn constantly.
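A schema registry's job, as described in the duties above, is to keep producers and consumers compatible by rejecting messages whose shape has drifted. The sketch below is a toy, assumption-laden version: an in-memory queue stands in for a Kafka topic, and a type-checking dictionary stands in for real Avro/Protobuf schema validation; none of the names come from the kafka-python or confluent-kafka APIs.

```python
import json
import queue

# In-memory stand-in for a Kafka topic (illustrative only).
topic = queue.Queue()

# A minimal "schema registry": required field name -> required type.
schema = {"order_id": int, "amount": float}

def produce(record):
    topic.put(json.dumps(record))  # serialize, as a Kafka producer would

def consume_all():
    valid, rejected = [], 0
    while not topic.empty():
        record = json.loads(topic.get())
        # Reject any message whose fields are missing or mistyped.
        if all(isinstance(record.get(k), t) for k, t in schema.items()):
            valid.append(record)
        else:
            rejected += 1
    return valid, rejected

produce({"order_id": 1, "amount": 9.99})
produce({"order_id": "oops", "amount": 1.0})  # fails the schema check
valid, rejected = consume_all()
print(len(valid), rejected)  # 1 1
```

In a real deployment, Confluent Schema Registry performs this check centrally and versions the schemas so they can evolve without breaking consumers.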
Required qualifications to be successful in this role:
• A Bachelor's degree in Computer Science, Engineering, Management Information Systems, or Computer Information Systems is required.
• At least 5 years of proficiency in Java and other programming languages used for data engineering.
• Must have GCP data platform experience.
• Experience setting up and managing Kafka clusters in production environments.
• Familiarity with schema registry and schema evolution concepts.
• Deep understanding of techniques used in creating and serving schemas at the time of data consumption.
• Hands-on experience with Google BigQuery or other similar cloud-based data warehousing systems.
• Strong understanding of data flow management and ETL principles.
• Strong interpersonal and excellent written and verbal communication skills, combined with comprehensive, practical experience and knowledge in end-to-end delivery of big data solutions.

Preferred Qualifications:
• Familiarity with various data storage and processing technologies, such as relational databases, NoSQL, and Hadoop.
• Proficiency in programming languages and frameworks commonly used in data engineering, such as Python, SQL, and Spark.
• Experience with AWS/Azure data platforms.
• Familiarity with different development methodologies (e.g. waterfall, agile, XP, scrum).
• Understanding of data governance and data management practices.
• Experience working with multiple clients and projects at a time.

#LI-KM1

Skills: Solution Analysis, DevOps, English, Requirements Analysis

What you can expect from us:
Together, as owners, let's turn meaningful insights into action. Life at CGI is rooted in ownership, teamwork, respect, and belonging. Here, you'll reach your full potential because… You are invited to be an owner from day 1 as we work together to bring our Dream to life. That's why we call ourselves CGI Partners rather than employees.
We benefit from our collective success and actively shape our company's strategy and direction. Your work creates value. You'll develop innovative solutions and build relationships with teammates and clients while accessing global capabilities to scale your ideas, embrace new opportunities, and benefit from expansive industry and technology expertise. You'll shape your career by joining a company built to grow and last. You'll be supported by leaders who care about your health and well-being and provide you with opportunities to deepen your skills and broaden your horizons. Come join our team, one of the largest IT and business consulting services firms in the world.
Data Engineer
Origineer Consulting Inc, Burnaby, BC, CA
Situated in the Greater Vancouver area, Origineer Consulting is a consulting firm that specializes in offering a wide array of services, including professional consulting, strategic planning, creative execution, and intelligent technology utilization. Our adept team, with expertise across various areas, delivers well-rounded strategies and solutions. We are looking for an experienced Data Engineer to join our dynamic team.

Job Responsibilities:
* Collect data from various sources like databases, spreadsheets, surveys, and external datasets.
* Clean and preprocess data to remove errors, ensuring accuracy for analysis.
* Use statistical and mathematical techniques to analyze large datasets and identify trends, patterns, and correlations.
* Develop and implement statistical models or machine learning algorithms for predicting future trends based on historical data.
* Create visual representations of data with charts and graphs for easier understanding by stakeholders.
* Draw meaningful conclusions from data analysis and communicate findings clearly to non-technical stakeholders.
* Utilize databases for efficient data storage, retrieval, and manipulation, often requiring SQL skills.
* Stay updated on industry trends and technologies to enhance data analysis skills.

Job Requirements:
* Bachelor's degree or college program in statistics, mathematics, computer science, computer systems engineering, or another related discipline.
* 3 years in programming, data analysis, or another related field.
* Consideration may be given to an equivalent combination of work experience and education.
* Strong analytical skills.
* Attention to detail.
* Communication and collaboration skills.

Contact Method:
We invite candidates from diverse backgrounds to apply; however, please be aware that preference will be given to qualified citizens or permanent residents for this position. Kindly submit your resume to . Please be informed that only selected candidates will be contacted for interviews.
We appreciate the interest of all applicants.
Data Engineer III-Big Data
JPMorgan Chase, Bengaluru, India
Be part of a dynamic team where your distinctive skills will contribute to a winning culture and team.

As a Data Engineer III - Big Data at JPMorgan Chase within the Corporate & Investment Bank Payments Technology Team, you serve as a seasoned member of an agile team that designs and delivers trusted data collection, storage, access, and analytics solutions in a secure, stable, and scalable way. You are responsible for developing, testing, and maintaining critical data pipelines and architectures across multiple technical areas within various business functions in support of the firm's business objectives.

Job responsibilities
- Supports review of controls to ensure sufficient protection of enterprise data.
- Advises on and makes custom configuration changes in one to two tools to generate a product at the business or customer request.
- Updates logical or physical data models based on new use cases.
- Frequently uses SQL and understands NoSQL databases and their niche in the marketplace.
- Adds to a team culture of diversity, equity, inclusion, and respect.

Required qualifications, capabilities, and skills
- Formal training or certification on data lifecycle concepts and 3+ years of applied experience.
- Experience across the data lifecycle.
- Experience with batch and real-time data processing with Spark or Flink.
- Working knowledge of AWS Glue and EMR usage for data processing.
- Experience working with Databricks.
Experience working with Python/Java, PySpark etc., Advanced at SQL (e.g., joins and aggregations) Working understanding of NoSQL databases Significant experience with statistical data analysis and ability to determine appropriate tools and data patterns to perform analysis.Preferred qualifications, capabilities, and skills Worked with building Data lake, built Data platforms, built Data frameworks, Built/Design of Data as a Service APIAbout usJPMorgan Chase & Co., one of the oldest financial institutions, offers innovative financial solutions to millions of consumers, small businesses and many of the world's most prominent corporate, institutional and government clients under the J.P. Morgan and Chase brands. Our history spans over 200 years and today we are a leader in investment banking, consumer and small business banking, commercial banking, financial transaction processing and asset management.We recognize that our people are our strength and the diverse talents they bring to our global workforce are directly linked to our success. We are an equal opportunity employer and place a high value on diversity and inclusion at our company. We do not discriminate on the basis of any protected attribute, including race, religion, color, national origin, gender, sexual orientation, gender identity, gender expression, age, marital or veteran status, pregnancy or disability, or any other basis protected under applicable law. We also make reasonable accommodations for applicants' and employees' religious practices and beliefs, as well as mental health or physical disability needs. Visit our FAQs for more information about requesting an accommodation.About the TeamThe Corporate & Investment Bank is a global leader across investment banking, wholesale payments, markets and securities services. The world's most important corporations, governments and institutions entrust us with their business in more than 100 countries. 
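The "advanced SQL (e.g., joins and aggregations)" requirement named above can be illustrated with a minimal, self-contained sketch; the table and column names are hypothetical, not JPMorgan's actual schema:

```python
import sqlite3

# In-memory database with two hypothetical tables.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE payments (id INTEGER, client_id INTEGER, amount REAL);
    CREATE TABLE clients  (id INTEGER, name TEXT);
    INSERT INTO clients  VALUES (1, 'Acme'), (2, 'Globex');
    INSERT INTO payments VALUES (10, 1, 100.0), (11, 1, 250.0), (12, 2, 40.0);
""")

# Join payments to clients, then aggregate per client.
rows = con.execute("""
    SELECT c.name, COUNT(*) AS n, SUM(p.amount) AS total
    FROM payments p
    JOIN clients c ON c.id = p.client_id
    GROUP BY c.name
    ORDER BY total DESC
""").fetchall()
print(rows)  # [('Acme', 2, 350.0), ('Globex', 1, 40.0)]
```

The same join/group-by pattern carries over directly to Spark SQL or PySpark DataFrame operations mentioned in the posting.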
We provide strategic advice, raise capital, manage risk and extend liquidity in markets around the world.Salary: . Date posted: 03/26/2024 10:23 PM
Cloud Security Architect, Deloitte Global Technology
Deloitte,
Job Type:Permanent Work Model:[[workModel]] Reference code:126048 Primary Location:Toronto, ON All Available Locations:Toronto, ON; Burlington, ON; Calgary, AB; Edmonton, AB; Fredericton, NB; Halifax, NS; Kitchener, ON; Moncton, NB; Ottawa, ON; Regina, SK; Saint John, NB; Saskatoon, SK; St. John's, NL; Vancouver, BC; Victoria, BC; Winnipeg, MB Our Purpose At Deloitte, we are driven to inspire and help our people, organization, communities, and country to thrive. Our Purpose is to build a better future by accelerating and expanding access to knowledge. Purpose defines who we are and gives us reason to exist as an organization. By living our Purpose, we will make an impact that matters. Learn from deep subject matter experts through mentoring and on the job coaching Have many careers in one Firm. Partner with clients to solve their most complex problems What will your typical day look like? The Cloud Security Architect will be responsible for providing strategic direction and operational excellence in cyber security standards, security solutions and will play a crucial role in designing, implementing, and maintaining robust cybersecurity solutions, strategies, and policies.This role will ensure that the stakeholder security requirements necessary to protect the organization's mission and business processes are adequately addressed in all aspects of enterprise architecture including operating models, strategic roadmaps and solution architectures. The Cloud Security Architect will identify suitable solutions, define requirements and evaluation criteria, conduct POC/POT and publish the analysis of the findings. The incumbent will have a deep understanding of cybersecurity technologies, a strong grasp and continued learning of industry best practices, and the ability to lead and collaborate with cross-functional teams. 
Responsibilities:
* Provide thought leadership and own the technical engagement and ultimate success around specific innovation challenge projects, and define potential implementation architectures
* Stay informed about emerging cloud security trends, technologies and threats
* Develop, socialize and implement comprehensive multi-stakeholder cloud security strategies aligned with organizational goals while incorporating People, Process and Technology
* Influence and guide major stakeholders and partner with them to implement cloud security solutions in public cloud environments
* Research new technologies, processes, and solutions to enhance capabilities across the enterprise
* Ensure the team is following industry standards, regulations, technical guidelines, security needs, and design patterns aligned to enterprise architecture and strategy

About the team: Deloitte Technology works at the forefront of technology development and processes to support and protect Deloitte around the world. In this truly global environment, we operate not in "what is" but rather "what can be" to help Deloitte deliver and connect with its clients, its communities, and one another in ways not previously conceived. Enough about us, let's talk about you: In this role, the candidate will bring a deep understanding of cybersecurity and data privacy principles, extensive experience in building cyber security programs and a proven track record of success in leading cross-functional teams. The successful candidate will be an experienced professional in cloud security technologies, a big-picture thinker, and a proficient presenter and communicator who is ready to be the focal driving force behind strategic ideas and to bring those ideas to execution and realization. 
In this role, the candidate will have:
* Prior experience as a Cloud Security Architect or similar role
* In-depth understanding and applied experience with cloud and cyber security frameworks, standards, and best practices
* Hands-on experience implementing cloud-native security services and platforms
* Robust interpersonal, verbal presentation, and written communication skills, with the ability to work independently
* Experience interfacing with stakeholders and end users, utilizing consulting and negotiating skills, storytelling, and an executive language
* Expert knowledge of cloud methodologies (IaaS, PaaS, SaaS), automation, orchestration, and trends, alongside industry-leading cloud vendor offerings and integrations
* Project management, service definition, service development, and delivery experience
* One of the following certifications (mandatory): Azure Solutions Architect Expert, Google Professional Cloud Architect, or AWS Solutions Architect Professional
* Desired (not mandatory) certifications: ISC2 CCSP, AWS Security Specialty, Google Professional Cloud Security Engineer, Azure Security Engineer Expert

Total Rewards: The salary range for this position is $69,000 - $114,000, and individuals may be eligible to participate in our bonus program. Deloitte is fair and competitive when it comes to the salaries of our people. We regularly benchmark across a variety of positions, industries, sectors, targets, and levels. Our approach is grounded on recognizing people's unique strengths and contributions and rewarding the value that they deliver. Our Total Rewards Package extends well beyond traditional compensation and benefit programs and is designed to recognize employee contributions, encourage personal wellness, and support firm growth. Along with a competitive base salary and variable pay opportunities, we offer a wide array of initiatives that differentiate us as a people-first organization. 
Some representative examples include: $4,000 per year for mental health support benefits, a $1,300 flexible benefit spending account, 38+ days off (including 10 firm-wide closures known as "Deloitte Days"), flexible work arrangements and a hybrid work structure. Our promise to our people: Deloitte is where potential comes to life. Be yourself, and more. We are a group of talented people who want to learn, gain experience, and develop skills. Wherever you are in your career, we want you to advance. You shape how we make impact. Diverse perspectives and life experiences make us better. Whoever you are and wherever you're from, we want you to feel like you belong here. We provide flexible working options to support you and how you can contribute. Be the leader you want to be. Some guide teams, some change culture, some build essential expertise. We offer opportunities and experiences that support your continuing growth as a leader. Have as many careers as you want. We are uniquely able to offer you new challenges and roles - and prepare you for them. We bring together people with unique experiences and talents, and we are the place to develop a lasting network of friends, peers, and mentors. Our TVP is about relationships - between leaders and their people, the firm and its people, peers, and within our communities. The next step is yours: At Deloitte, we are all about doing business inclusively - that starts with having diverse colleagues of all abilities. Deloitte encourages applications from all qualified candidates who represent the full diversity of communities across Canada. This includes, but is not limited to, people with disabilities, candidates from Indigenous communities, and candidates from the Black community in support of living our values, creating a culture of Diversity, Equity and Inclusion and our commitment to our AccessAbility Action Plan, Reconciliation Action Plan and the BlackNorth Initiative. 
We encourage you to connect with us at [email protected] if you require an accommodation for the recruitment process (including alternate formats of materials, accessible meeting rooms or other accommodations) or [email protected] for any questions relating to careers for Indigenous peoples at Deloitte (First Nations, Inuit, Métis). By applying to this job you will be assessed against the Deloitte Global Talent Standards. We've designed these standards to provide our clients with a consistent and exceptional Deloitte experience globally. Deloitte Canada has 30 offices with representation across most of the country. We acknowledge that our offices reside on traditional, treaty and unceded territories as part of Turtle Island, which is still home to many First Nations, Métis, and Inuit peoples. We are all Treaty people. Job Segment: Cyber Security, Information Technology, IT Architecture, Solution Architect, Developer, Security, Technology
Senior Cloud Data Engineer (GCP) with Python & SQL to build data pipelines in the enterprise data lake and contribute to the design of data flow.
S.i. Systems, Toronto, ON
Our client is looking for a Senior Cloud Data Engineer (GCP) with Python & SQL to build data pipelines in the enterprise data lake and contribute to the design of data flow. Duration: until Dec 31st initially. Project: Build data pipelines in the enterprise data lake, Cerebro, which will now cover more use cases; Cerebro is expanding. Our client is also building its own ML Zone, an MLOps platform within the data lake platform, to deploy ML and analytics solutions that are more complex than a pipeline. Duration: 1 year. Location: Brampton - 3 days/week.

Must Have Skills:
* Cloud Data Engineering - Data Warehouse and Data Lake Solutions
* Google Cloud Platform (GCP), AWS or Azure
* Python, SQL
* Data Modelling - vet on the following data modeling techniques: hierarchical, network, relational, object-oriented, entity-relationship, dimensional, and graph.

Apply
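The Python + SQL pipeline work described above can be sketched as an extract-transform-load flow; SQLite stands in for the real warehouse (e.g., BigQuery on GCP), and the source rows, table names, and channel values are invented for illustration:

```python
import sqlite3

def extract():
    # Stand-in for reading from a source system; hypothetical (day, channel, count) rows.
    return [("2024-01-01", "web", 3), ("2024-01-01", "store", 5), ("2024-01-02", "web", 2)]

def transform(rows):
    # Keep only web traffic and rename fields for the target model.
    return [{"day": d, "visits": n} for (d, channel, n) in rows if channel == "web"]

def load(records, con):
    # Land the transformed records in the warehouse table.
    con.execute("CREATE TABLE IF NOT EXISTS web_visits (day TEXT, visits INTEGER)")
    con.executemany("INSERT INTO web_visits VALUES (:day, :visits)", records)

con = sqlite3.connect(":memory:")  # stand-in for the actual data lake / warehouse
load(transform(extract()), con)
total = con.execute("SELECT SUM(visits) FROM web_visits").fetchone()[0]
print(total)  # 5
```

In a production GCP setup, extract/load would typically use the BigQuery client libraries and an orchestrator such as Cloud Composer, but the stage separation is the same.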
Cloud Support Associate Intern - Japanese Fluency
Amazon, Toronto, Ontario
DESCRIPTION
Amazon Web Services (AWS) internships are full-time (40 hours/week) for 12 consecutive weeks during summer. By applying to this position, your application will be considered for the Toronto, ON internships. Would you like to use the latest cloud computing technologies? As a Cloud Support Associate Intern you will learn to solve critical, highly complex customer problems that may span multiple AWS services and partner with AWS teams to help reproduce and resolve customer issues. Our Cloud Support team provides technical support to our customers across the globe. The Cloud Support Associate Internship is a training program for the Cloud Support Associate role. This is an excellent opportunity to join one of Amazon's technical teams, working with some of the best and brightest engineers, while also developing your skills and furthering your career within one of the most innovative and progressive technology companies. If this sounds exciting to you - come build the future with us!

Key job responsibilities
• Gaining experience with AWS cloud services including EC2, load balancers, and S3 storage solutions
• Collaborating with AWS engineers and fellow peers in a virtual classroom to learn about AWS services and practice troubleshooting techniques
• Applying new skills through customer case simulations

We are open to hiring candidates to work out of one of the following locations: Toronto, ON, CAN

BASIC QUALIFICATIONS
• Currently enrolled in a Bachelor's degree program with a graduation conferral date between December 2024 and August 2025
• Experience or coursework in networking and operating systems
• Fluency in Japanese and English

PREFERRED QUALIFICATIONS
• Knowledge of internet fundamentals and cloud computing concepts
• Experience troubleshooting networks (e.g., TCP/IP, DNS, routing, switching, firewalls, LAN/WAN, traceroute, iperf, dig, cURL or similar/related technology)
• Experience with at least one functional scripting language (e.g., Perl, Python, Ruby, shell scripting)

Amazon is committed to a diverse and inclusive workplace. Amazon is an equal opportunity employer and does not discriminate on the basis of race, national origin, gender, gender identity, sexual orientation, disability, age, or other legally protected status. If you would like to request an accommodation, please notify your Recruiter. Salary: . Date posted: 03/19/2024 10:19 PM
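The network-troubleshooting fundamentals listed in the qualifications above (DNS resolution, TCP reachability) can be practiced with a short Python sketch; the hostnames are placeholders, and the checks mirror what `dig` and `nc -z`/`curl` probes do:

```python
import socket

def resolve(host):
    """DNS lookup, the usual first step in connectivity triage."""
    try:
        return socket.gethostbyname(host)
    except socket.gaierror:
        return None  # name does not resolve

def can_connect(host, port, timeout=2.0):
    """TCP reachability check, similar in spirit to `nc -z` or a curl probe."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # covers DNS failure, refusal, and timeout
        return False

print(resolve("localhost"))  # typically 127.0.0.1
```

Splitting triage into "does the name resolve?" and "does the port answer?" quickly narrows a fault to DNS, routing/firewalling, or the service itself.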
Data Engineer
WorkSafeBC, Richmond, BC
Overview: Are you passionate about creating re-usable analytics datasets which are used to optimize business decisions, provide departmental oversight, track the success of our business objectives, and explore the next new business improvement insight? In this role, you will be responsible for ensuring the quality, timeliness, availability, and design of data for analytics at WorkSafeBC. How you'll make a difference: You'll turn data into knowledge that benefits workers and employers across B.C. Where you'll work: At WorkSafeBC, we offer a hybrid work model that combines the convenience of working remotely with the dynamism of working in one of our offices, based on the operational needs of the position. In this role, you'll work primarily from your B.C. home and occasionally in our Richmond office. What you'll do: As a data engineer you will:
* Ingest a wide variety of data into data platforms for analytics
* Create datasets for analytics by designing logical data models and turning them into physical data structures
* Build and maintain ETL pipelines by sequencing and scheduling ETL scripts
* Ensure analytics-ready data is of sufficient quality to make critical operational, tactical, and strategic business decisions
* Contribute and work under the parameters of organizational data governance, security policies, and privacy requirements
* Administer analytics data platforms by working with various technology professionals
* Develop resources which aid analytics data platform users by developing data models, data dictionary definitions, and source-to-target mappings
* Conduct impact analysis for upstream application changes
* Maintain a strong relationship with analytics solution technology and analytics delivery teams

Is this a good fit for you? 
We're looking for someone who can:
* Independently organize and prioritize a fluctuating workload, delegating where appropriate, while maintaining accuracy and timelines
* Teach, mentor and lead others, ensuring knowledge transfer within the workgroup
* Establish and maintain credibility within and outside of the immediate team through collaboration and using one's own expertise to deliver high quality results
* Elicit business requirements using interviews, document analysis, requirement workshops, surveys, site visits, business process descriptions, business cases, scenarios and workflow analysis, and make appropriate recommendations or decisions in a proactive manner
* Use a variety of computer-based business intelligence reporting, dashboard, and other tools to create medium- to high-complexity end-user solutions
* Demonstrate working knowledge of relational database systems, concepts, structures and business and technical models
* Demonstrate specialist knowledge of creating scripts/jobs/packages to extract data from sources, transform data, and load data into data stores for use in business intelligence, whilst ensuring a high degree of data integrity, data quality, data store performance, storage efficiency and other requirements
* Demonstrate working knowledge of logical design and data modeling, ideal for use in business intelligence solutions, including entity relationship diagrams, source-to-target mappings, data dictionaries, and understanding end-to-end data lineage from the originating source to the end solution to determine impacts of changes

Your background and experience: A degree in Computer Science or STEM (Science, Technology, Engineering, Math). A minimum of 3 years of analytics-related experience, inclusive of a minimum of 2 years in a role building SQL/ETL for an analytics data platform. We'll consider an equivalent combination of education and 
experience. Important to know Before we can finalize any offer of employment, you must: Consent to a criminal record check Confirm you're legally entitled to work in Canada WorkSafeBC's COVID-19 Employee Mandatory Vaccine Policy (the "Policy") is suspended effective January 9, 2023, however we reserve the right to re-implement it in response to changes in the public health landscape, including public health orders. We are committed to the protection, health, and safety of our employees and our Communicable Disease Prevention Program and related protocols remain in effect. Who we are At WorkSafeBC, we promote safe and healthy workplaces across British Columbia. We partner with workers and employers to save lives and prevent injury, disease, and disability. When work-related injuries or diseases occur, we provide compensation and support injured workers in their recovery, rehabilitation, and safe return to work. We're honoured to serve the 2.49 million workers and 263,000 registered employers in our province. What's it like to work at WorkSafeBC? It's challenging, stimulating, and rewarding. Our positions offer diversity and opportunities for professional growth. Every day, the work we do impacts people and changes lives. What we do is important, and so are the people we do it for. Our ability to make a difference relies on building a team with a rich variety of skills, knowledge, backgrounds, abilities, and experiences that reflects the diversity of the people we serve. We are committed to fostering a welcoming, inclusive, and supportive work culture where everyone can contribute as their best, authentic self. Learn more: Discover who we are . Our benefits As a member of our team, you'll have access to services and benefits that help you get the most out of work - and life. 
Along with a competitive salary, your total compensation package includes: Defined benefit pension plan that provides you with a lifetime monthly pension when you retire 3 weeks of vacation in your first year, with regular increases based on years of service Extensive health care and dental benefits Optional leave and earned-time-off arrangements Development opportunities (tuition reimbursement, leadership development, and more) Learn more: Find out what we offer . Salary: $43.57 - $54.91/hourly Want to apply? Applications are welcomed immediately, however must be received no later than 4:30 p.m. PST on the closing date. Please note that we will be starting assessments prior to the closing date. We encourage all qualified applicants to apply . If you require an accommodation in the assessment process, please email Recruitment Testing Accommodation (SM) when you submit your application. Any additional application materials must be received by email to HR Talent Acquisition (SM) by 4:30 p.m. PST on the closing date of the competition.
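The source-to-target mappings mentioned in the posting above can be sketched as a small transform table; the source column names, target names, and transforms are hypothetical, not WorkSafeBC's actual data dictionary:

```python
# Hypothetical source-to-target mapping: source column -> (target column, transform).
MAPPING = {
    "CLM_DT":  ("claim_date", str.strip),
    "WRK_AGE": ("worker_age", int),
    "REGN":    ("region", str.upper),
}

def apply_mapping(source_row):
    """Produce an analytics-ready record from a raw source row."""
    return {tgt: fn(source_row[src]) for src, (tgt, fn) in MAPPING.items()}

row = apply_mapping({"CLM_DT": " 2024-03-01 ", "WRK_AGE": "42", "REGN": "bc"})
print(row)  # {'claim_date': '2024-03-01', 'worker_age': 42, 'region': 'BC'}
```

Keeping the mapping as data rather than inline code is what makes a data dictionary and end-to-end lineage documentation straightforward to generate.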
Data Engineer
Discovery, Inc. (Formerly Scripps Networks Interactive), Hyderabad, Any, India
Every great story has a new beginning, and yours starts here. Welcome to Warner Bros. Discovery... the stuff dreams are made of. Who We Are... When we say, "the stuff dreams are made of," we're not just referring to the world of wizards, dragons and superheroes, or even to the wonders of Planet Earth. Behind WBD's vast portfolio of iconic content and beloved brands are the storytellers bringing our characters to life, the creators bringing them to your living rooms and the dreamers creating what's next... From brilliant creatives to technology trailblazers across the globe, WBD offers career-defining opportunities, thoughtfully curated benefits, and the tools to explore and grow into your best self. Here you are supported, here you are celebrated, here you can thrive. Your New Role: This role will work with a fast-paced team to create the data technology stack that can be used to deliver end-user insights for developing the next generation of digital products and product strategies. You will also help build out an internal data pipeline and develop ways to optimize data technologies to shape our digital data strategy. Your Role Accountabilities: Gain an understanding of brand and enterprise data warehouse/reporting requirements that will inform the data architecture and modeling approach. Employ these learnings to construct appropriate ETL processes, database queries, and data/permission models that support data aggregation and reporting on actionable insights. Passion for writing code that is efficient, organized, simple and scalable, meeting business requirements. Enthusiasm to learn and find opportunities to enhance and adapt in daily engineering activities is highly desired. Debug, troubleshoot, design, and implement solutions to complex technical issues. Deliver end-to-end JIRA user stories meeting the quality expectations of the team. Familiarity with BI tools and the ability to create semantic layer models for the business users. 
Participate in QA testing for data pipeline projects as well as implementation changes to the suite of analytical tools. Monitor batch data loads to meet SLAs; you should be able to quickly respond to and resolve production issues. Ability to thrive in a team-based environment and flexibility to work a second shift.

Qualifications & Experiences:
* Bachelor's degree in computer science, information systems, or information technology
* 5-8 years of experience in data engineering
* Knowledge of supporting data sets for Home Entertainment, Games, DVD/Digital business, Content Sales and Licensing
* Knowledge of SAP supply chain, APO, order management, trade spends, promotions, POS (Point of Sale), royalty, forecasting and cash collections
* Experience with programming languages - SQL, Python, AWS (Amazon Web Services) Glue
* Strong experience with MPP databases (Teradata & Snowflake)
* Experience with Snowpipe, tasks, streams, clustering, Time Travel, cache, and data sharing
* Experience in conceptual/logical/physical data modelling, with expertise in relational and dimensional data modelling
* Experience with AWS cloud services - Kinesis, Lambda, IAM (Identity and Access Management) policies
* Experience in SQL query tuning and cost optimization
* Experienced in software delivery through continuous integration (for example Git, Bitbucket, Jenkins, etc.)
* Experienced in one or more automation and scheduling tools (for example Redwood, Airflow, etc.)
* Must be comfortable working in a Linux/Unix environment
* Familiarity with AWS developer tools such as CodeDeploy and Data Pipeline
* Experience with public/private API integration, web scraping, and data streaming architecture
* Knowledge of Business Content Interchange (BCI)

Not required but preferred: Public speaking and presentation skills. Experience with dbt.

How We Get Things Done... This last bit is probably the most important! Here at WBD, our guiding principles are the core values by which we operate and are central to how we get things done. 
You can find them at www.wbd.com/guiding-principles/ along with some insights from the team on what they mean and how they show up in their day to day. We hope they resonate with you and look forward to discussing them during your interview. The Legal Bits... Warner Bros. Discovery embraces the opportunity to build a workforce that reflects the diversity of our society and the world around us. Being an equal opportunity employer means that we take seriously our responsibility to consider qualified candidates on the basis of merit, regardless of sex, gender identity, ethnicity, age, sexual orientation, religion or belief, marital status, pregnancy, parenthood, disability or any other category protected by law. If you're a qualified candidate and you require adjustments or accommodations to search for a job opening or apply for a position, please contact us at [email protected]. Salary: . Date posted: 03/22/2024 10:04 AM
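The "monitor batch data loads to meet SLAs" duty in the posting above can be sketched as a simple check; the load names, schedule, and six-hour SLA are invented for illustration:

```python
from datetime import datetime, timedelta

# Hypothetical SLA: each batch load must land within 6 hours of its scheduled time.
SLA = timedelta(hours=6)

def breaches(loads, now):
    """Return names of loads that have missed the SLA and not yet landed."""
    return [name for name, scheduled, landed in loads
            if landed is None and now - scheduled > SLA]

now = datetime(2024, 3, 22, 12, 0)
loads = [
    # (load name, scheduled time, landed time or None if still running)
    ("sales_daily",  datetime(2024, 3, 22, 2, 0), datetime(2024, 3, 22, 4, 0)),
    ("royalty_feed", datetime(2024, 3, 22, 1, 0), None),  # still running
]
late = breaches(loads, now)
print(late)  # ['royalty_feed']
```

In practice a scheduler such as Airflow provides SLA hooks that fire this kind of check automatically, but the underlying comparison is this simple.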
Security DevOps Engineer
SAP, Toronto, ON
We help the world run better. Our company culture is focused on helping our employees enable innovation by building breakthroughs together. How? We focus every day on building the foundation for tomorrow and creating a workplace that embraces differences, values flexibility, and is aligned to our purpose-driven and future-focused work. We offer a highly collaborative, caring team environment with a strong focus on learning and development, recognition for your individual contributions, and a variety of benefit options for you to choose from. Apply now! ***To be eligible for this position you are required to obtain and maintain a Canadian clearance, meaning you MUST be a Canadian Citizen or have had your PR card for more than 5 years with a clean background*** Job Description: The DevOps Engineer is accountable for automating all the manual tasks for developing and deploying code and data to implement continuous deployment and continuous integration frameworks. They are also responsible for maintaining high availability of production and non-production work environments. They should possess sound knowledge of various tools and technologies used by other team members. The DevOps Engineer handles the entire DevOps lifecycle and is accountable for the implementation of the DevOps process in the Sovereign Cloud Services organization. Job Requirements / Key Responsibilities: The DevOps Engineer is responsible for creating software deployment strategies that are essential for the successful deployment of software in the work environment. They identify and implement data storage methods like clustering to improve the performance of the team. The DevOps Engineer is responsible for coming up with solutions for managing a vast number of documents in real time that enable quick search and analysis. They also identify issues in the production phase and systems, and implement monitoring solutions to overcome those issues. 
The DevOps Engineer will stay abreast of industry trends and best practices. They conduct research, tests, and execute new techniques which could be reused and applied to the software development project. The DevOps Engineer is accountable for building and optimizing automation systems that help to execute business web and data infrastructure platforms. The DevOps Engineer also develops self-service solutions for the engineering department to deliver software with excellent quality and speed. They are also involved in designing and developing scaling strategies, automation scripts, and solutions to implement, streamline, and execute the software. The DevOps Engineer is involved in creating technology infrastructure and automation tools, and maintaining configuration management. They are accountable for conducting training sessions for the juniors in the team, and for other groups, regarding how to build processes wherein the dependencies are showcased in the code. They are also answerable for the architecture and technical leadership of the complete DevOps infrastructure. The Engineer also deals with tasks like management and development of continuous integration and deployment solutions across various sites. To meet the engineering department's quality standards, the DevOps Engineer needs to implement lifecycle infrastructure solutions and documentation operations. Qualifications: Bachelor's or master's degree in computer science, information systems, or a related engineering discipline, with enthusiasm for security and technology. Proficiency with security and SIEM solutions (Tenable, Nessus, Burp Suite, Splunk, QRadar, LogRhythm, etc.). Deep understanding of security vulnerabilities and the ability to assess their impact on products. Candidates for this position should possess at least 3 years of work experience as a DevOps Engineer in the same domain. 
Candidates must possess ample knowledge and experience in system automation, deployment, and implementation, including experience with Linux and NetWeaver and ample experience configuring and automating monitoring tools. They should also have experience with software development processes, tools, and languages. The DevOps Engineer should possess excellent communication skills, which are essential when guiding junior team members; conveying information clearly in turn determines the team's performance. A DevOps Engineer must be result-oriented, self-motivated, and proactive beyond their duties. They should be capable of multi-tasking and meeting deadlines, remain calm during uncertainty, work well in a collaborative environment, and be creative, highly analytical, strategic thinkers. Candidates should be technologically adept, have strong computer skills, and possess expertise in scaling distributed data systems, as well as skills in configuring, maintaining, and securing systems. Candidates should possess sound knowledge of cloud and automation tools such as Cloud Foundry, AWS, Azure, Docker, Kubernetes, Terraform, Ansible, Packer, SAML SSO, OAuth, and microservices. They should also have a thorough understanding of Google Cloud Platform, Hadoop, NoSQL databases, and big data clusters. The DevOps Engineer should be able to use all of the skills mentioned above to create an integrated, fully automated work environment: from source code management through to deployment (Continuous Integration, Continuous Delivery, and Continuous Deployment), there should be no unnecessary manual intervention in between.
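The fully automated pipeline described above boils down to a fail-fast chain of stages, where a failure anywhere stops everything downstream. A minimal sketch of that behavior, with illustrative stage names that are not tied to any particular CI/CD product:

```python
def run_pipeline(stages):
    """Run stages in order; stop at the first failure so nothing ships past a broken step."""
    results = []
    for name, step in stages:
        ok = step()
        results.append((name, ok))
        if not ok:
            break  # fail fast: later stages never run after a failure
    return results

# Illustrative stages; each callable returns True on success.
stages = [
    ("checkout", lambda: True),    # fetch source from version control
    ("unit-tests", lambda: True),  # run the test suite
    ("build", lambda: False),      # a failing build stops the pipeline here
    ("deploy", lambda: True),      # never reached because build failed
]

results = run_pipeline(stages)
```

In a real pipeline each stage would shell out to build, test, and deploy tooling; the point is that the chain itself, not a human, decides whether deployment happens.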
Experience with source code control and collaboration tools such as Git and Jira.

The Team: SAP recently decided to combine and harmonize all SAP Sovereign Cloud offerings, establishing an end-to-end process for the public sector and regulated industries. The business unit works with SAP Government Security & Secrecy (GS2) to further strengthen its mission to support governments and nations in protecting their most vital assets. SAP Sovereign Cloud Services collaborates closely with all Board areas to establish a harmonized, scalable, and consistent offering with transparent and standardized deliverables.

SAP Diversity Commitment: To harness the power of innovation, SAP invests in the development of its diverse employees. We aspire to leverage the qualities and appreciate the unique competencies that each person brings to the company. SAP is committed to the principles of Equal Employment Opportunity and to providing reasonable accommodations to applicants with physical, sensory and/or mental disabilities. If you are interested in applying for employment with SAP and are in need of accommodation or special assistance to navigate our website or to complete your application, please contact us at [email protected]. Requests for reasonable accommodation will be considered on a case-by-case basis.

We build breakthroughs together: SAP innovations help more than 400,000 customers worldwide work together more efficiently and use business insight more effectively. Originally known for leadership in enterprise resource planning (ERP) software, SAP has evolved to become a market leader in end-to-end business application software and related services for database, analytics, intelligent technologies, and experience management. As a cloud company with 200 million users and more than 100,000 employees worldwide, we are purpose-driven and future-focused, with a highly collaborative team ethic and commitment to personal development.
Whether connecting global industries, people, or platforms, we help ensure every challenge gets the solution it deserves. At SAP, we build breakthroughs, together.

We win with inclusion
SAP's culture of inclusion, focus on health and well-being, and flexible working models help ensure that everyone, regardless of background, feels included and can run at their best. At SAP, we believe we are made stronger by the unique capabilities and qualities that each person brings to our company, and we invest in our employees to inspire confidence and help everyone realize their full potential. We ultimately believe in unleashing all talent and creating a better and more equitable world. SAP is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to the values of Equal Employment Opportunity and provide accessibility accommodations to applicants with physical and/or mental disabilities. If you are interested in applying for employment with SAP and are in need of accommodation or special assistance to navigate our website or to complete your application, please send an e-mail with your request to the Recruiting Operations Team: [email protected].

For SAP employees: Only permanent roles are eligible for the SAP Employee Referral Program, according to the eligibility rules set in the SAP Referral Policy. Specific conditions may apply for roles in Vocational Training.

EOE AA M/F/Vet/Disability
Qualified applicants will receive consideration for employment without regard to their age, race, religion, national origin, ethnicity, gender (including pregnancy, childbirth, and related conditions), sexual orientation, gender identity or expression, protected veteran status, or disability.

Compensation Range Transparency: SAP believes the value of pay transparency contributes toward an honest and supportive culture and is a significant step toward demonstrating SAP's commitment to pay equity.
SAP provides the annualized compensation range, inclusive of base salary and variable incentive target, for the career level applicable to the posted role. The targeted combined range for this position is 98,500 - 167,400 CAD. The actual amount offered to the successful candidate will be within that range, dependent upon the key aspects of each case, which may include education, skills, experience, scope of the role, location, etc., as determined through the selection process. Any SAP variable incentive includes a targeted dollar amount, and any actual payout is dependent on company and personal performance. Please reference this link for a summary of SAP benefits and eligibility requirements: SAP North America Benefits.

Requisition ID: 390727 | Work Area: Software-Development Operations | Expected Travel: 0 - 10% | Career Status: Professional | Employment Type: Regular Full Time | Posted Date: Mar 22, 2024 | Location: Toronto, ON, CA, M5K 1B7 | #LI-Hybrid
Data Engineer
Four Seasons Hotels and Resorts, Four Seasons Corporate Office Toronto, Any
About Four Seasons: Four Seasons is powered by our people. We are a collective of individuals who crave to become better, to push ourselves to new heights and to treat each other as we wish to be treated in return. Our team members around the world create amazing experiences for our guests, residents, and partners through a commitment to luxury with genuine heart. We know that the best way to enable our people to deliver these exceptional guest experiences is through a world-class employee experience and company culture. At Four Seasons, we believe in recognizing a familiar face, welcoming a new one and treating everyone we meet the way we would want to be treated ourselves. Whether you work with us, stay with us, live with us or discover with us, we believe our purpose is to create impressions that will stay with you for a lifetime. It comes from our belief that life is richer when we truly connect to the people and the world around us.

About the location: Four Seasons Hotels and Resorts is a global, luxury hotel management company. We manage over 120 hotels and resorts and 50 private residences in 47 countries around the world and growing. Central to the Four Seasons employee experience and social impact programming is the company's commitment to supporting cancer research and the advancement of diversity, inclusion, equality and belonging at Four Seasons corporate offices and properties worldwide.
At Four Seasons, we are powered by people and our culture enables everything we do.

The Data Engineer is a strong technical team member with certifications and experience developing and operating Microsoft Azure and Microsoft Power BI platforms. The role includes both Azure Data Engineer and Power BI Reporting Engineer responsibilities, on a team supporting and managing Four Seasons' Azure environments and infrastructure and the growing enterprise portfolio of Power BI dashboards and reports. This tool set includes, but is not limited to, ADL, ADF, Synapse, Databricks, AAS, Cosmos DB, Function Apps, Logic Apps, Power BI, and the Power suite.

The role includes responsibilities on the technical Run Operations team, working with the Four Seasons team as well as contractors and consultants in steady-state operations. With Four Seasons' growth in mind, the role also includes assisting in the smooth intake and integration of recently delivered products and sources into efficient day-to-day Run Operations. It therefore includes active participation in the daily Operations stand-up, daily work prioritization, performance and environment monitoring, issue and outage triage, and oversight of resources addressing many ServiceNow queues.

The role also includes aspects of technical delivery: providing key Four Seasons source, infrastructure, data model and KPI knowledge to technical delivery teams as required. It will also at times involve leading BI projects and project teams through requirements collection, build phases, promotion and hyper-care. The candidate is capable of working with users across our community, at a breadth of levels, to understand requirements, concerns and issues, and to translate and incorporate institutional knowledge into technical solutions.
Strong service, communication and problem-solving skills are key. The candidate will work closely with our Manager, IT Data Engineering and Manager, IT Data Operations. This role is based in the Four Seasons Hotels and Resorts Toronto Corporate Office, reporting to the Senior Director, Enterprise Business Intelligence and Sales Applications, IT Operations. The role involves interactions primarily with internal stakeholders at various levels.

What You'll Be Doing

Data Engineering:
- Actively participate in, and lead if required, daily technical BI Operations work: issue and ticket triage, work prioritization and escalation processes.
- Maintain and monitor the Enterprise BI Production, Dev and Test environments to ensure high availability and quality, including proactive performance and resource management.
- Implement after-hours environment maintenance and outage/incident resolution, including communications with vendor partners as required.
- Provide proactive oversight of 12+ ServiceNow queues aggregating tickets globally, for service levels and opportunities for improvement.
- Participate in the design and implementation of BI solutions, including the design of the Azure data warehouse and building data pipelines, tables, data models, and reports.
- Participate in technical teams through design, build, and implementation of best-of-breed BI infrastructure for Four Seasons.
- Participate in and/or lead testing phases to completion; ensure successful defect resolution with technical and business teams during unit testing phases.
- Bring experience and discipline in DevOps practices to the role and work.

Reporting Engineering:
- Provide strong Power BI skills to enable Four Seasons both to create reports in house and to enhance existing ones as required.
- Engage with technical vendor partner resources on Power BI reporting engagements in Four Seasons' best interests.
- Assist in growing Four Seasons' Power BI skills among IT teammates and BU colleagues.
- Assist in, and own as required, the evaluation of Power BI reporting issues and solution options.

BI User Support:
- Grow and provide a solid understanding of Four Seasons institutional knowledge, the Business Units' BI needs, and reporting functionality.
- Respond to user questions regarding enterprise BI dashboards and reports, data and processes.
- Assist and guide other consulting, IT and BU colleagues and vendors regarding ticket triage and solutions.
- Conduct and facilitate meetings to solve day-to-day issues.

Enterprise Data Vendor Relationships:
- Engage with consultants and vendor partners regarding new source, infrastructure and reporting requirements and formats for Four Seasons' ingestion and reporting use.
- Liaise with vendors as required for issue, technical or functional support.
- Liaise with vendors to stay current on changes, releases and environment considerations as appropriate.

BI Documentation & Process:
- Collect and document requirements.
- Build and document Business Intelligence process models, specifications, diagrams and data flow documentation.
- Determine and document root cause analysis, remediation, and issue and outage summaries.
- Maintain secure deployment guides, architecture and integration diagrams.
- Ensure IT support materials are kept up to date.
Who You Are
- Excellent analytical, mathematical, and creative problem-solving skills.
- Excellent listening, interpersonal, written, and oral communication skills.
- Logical and efficient, with keen attention to detail.
- Self-motivated and directed.
- Ability to effectively prioritize and execute tasks while under pressure.
- Able to exercise independent judgement and act on it.
- Strong customer service orientation.
- Experience working in a team-oriented, collaborative environment.

Key Technical Skills
- Data Engineering in the Microsoft Azure environment and toolsets (ADLA, ADF, Synapse, Databricks, AAS, Cosmos DB, Function Apps, Logic Apps, etc.)
- Microsoft Power BI delivery and support
- Microsoft Power Suite delivery and support
- Strong SQL query skills
- Strong history in Python, C#, Scala, Java
- Data process improvement
- Working with large data sets
- Confluence, Jira, TestRail
- Helpdesk tools and methods
- Microsoft Visio
- Microsoft Office Suite, very strong Excel

What You Bring
- 4-5 years of Data Engineering, Visualization and Analytics experience in Azure environments, Power BI or similar platforms.
- University or higher degree in Engineering, Mathematics, Information Systems, Computer Science, Technology and/or Data Management, or Business Administration with a technical or IT interest.
- Candidates have preferably completed the Azure Fundamentals and Azure Data Engineer certifications.

This role follows a hybrid working model, requiring 3 days per week in the Four Seasons Corporate Office located at 1165 Leslie Street, Toronto, Ontario. #LI-Hybrid

Four Seasons is committed to providing employment accommodation in accordance with the Ontario Human Rights Code and the Accessibility for Ontarians with Disabilities Act. If contacted for an employment opportunity, please advise Human Resources if you require accommodation.

Date posted: 03/26/2024 09:54 AM
Data Engineer, Ad Finance, Display Finance BI
Amazon, Bangalore, Any, India
BASIC QUALIFICATIONS
- 3+ years of data engineering experience
- Experience with data modeling, warehousing and building ETL pipelines
- Experience with SQL

DESCRIPTION
Amazon is looking for a motivated individual with strong database and analytical skills and technology experience to join the Display Ads Finance team.

Key job responsibilities
- Own the design, operations and improvements of the organization's data warehouse infrastructure
- Maintain, improve and manage all ETL pipelines and clusters
- Explore and learn the latest AWS technologies to provide new capabilities and increase efficiency
- Define metrics and KPIs to measure the success of strategic initiatives and report on their progress
- Develop relationships and processes with finance, sales, business operations, solution delivery, partner, BD, and other cross-functional stakeholders
- Help continually improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers
- Collaborate with data scientists, BIEs and BAs to deliver high-quality data architecture and pipelines

A day in the life
In this position the successful candidate will be responsible for partnering with Finance and Business leaders to expand and optimize the data infrastructure that supports weekly, monthly, quarterly and annual reporting for the Display Ads Finance group and our stakeholders.

About the team
The candidate should enjoy creating pipelines, analyzing data, and recommending and implementing solutions to facilitate financial and metrics reporting.
- Plan, design, implement, and manage the deployment of a self-service data platform and visualization in QuickSight
- Create and maintain ETL procedures and SQL queries to bring in data from the data warehouse and alternate data sources
- Utilize database technologies, including Redshift, AWS EMR and QuickSight, to design, develop, and evaluate analyses and highly innovative business intelligence tools and reporting
- Scripting language such as Python preferred
- Establish scalable, efficient, automated processes for large-scale data analyses
- Support the development of performance dashboards that encompass key metrics to be reviewed with senior leadership and sales management
- Work with business owners and partners to build data sets that answer their specific business questions
- Support Financial Analysts, Sales Operations Leads and beyond in analyzing usage data to derive new insights and fuel customer success

We are open to hiring candidates to work out of one of the following locations: Bangalore, KA, IND

PREFERRED QUALIFICATIONS
- Experience with AWS technologies like Redshift, S3, AWS Glue, EMR, Kinesis, Firehose, Lambda, and IAM roles and permissions
- Experience with non-relational databases / data stores (object storage, document or key-value stores, graph databases, column-family databases)

Date posted: 03/27/2024 10:16 PM
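The ETL-and-KPI work this posting describes has a common shape: extract raw records, drop or clean invalid ones, load them into a warehouse table, then answer business questions with aggregate queries. A rough sketch using SQLite as a stand-in for a warehouse like Redshift; all table and column names are invented for illustration:

```python
import sqlite3

# extract: pretend these rows arrived from an upstream ad-metrics source
raw_rows = [
    {"day": "2024-03-01", "spend": 120.0, "clicks": 300},
    {"day": "2024-03-01", "spend": 80.0,  "clicks": 100},
    {"day": "2024-03-02", "spend": 50.0,  "clicks": 250},
    {"day": "2024-03-02", "spend": -1.0,  "clicks": 10},  # invalid record, dropped below
]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE ad_spend (day TEXT, spend REAL, clicks INTEGER)")

# transform + load: filter out records that fail validation before inserting
conn.executemany(
    "INSERT INTO ad_spend VALUES (:day, :spend, :clicks)",
    [r for r in raw_rows if r["spend"] >= 0],
)

# the kind of daily KPI rollup a finance dashboard would run
daily = conn.execute(
    "SELECT day, SUM(spend), SUM(clicks) FROM ad_spend GROUP BY day ORDER BY day"
).fetchall()
```

In production the same pattern runs at much larger scale, with the validation step and the aggregate queries scheduled rather than run by hand.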
Data Engineer
Discovery, Inc. (Formerly Scripps Networks Interactive), Hyderabad, Any, India
Every great story has a new beginning, and yours starts here. Welcome to Warner Bros. Discovery... the stuff dreams are made of.

Who We Are...
When we say "the stuff dreams are made of," we're not just referring to the world of wizards, dragons and superheroes, or even to the wonders of Planet Earth. Behind WBD's vast portfolio of iconic content and beloved brands are the storytellers bringing our characters to life, the creators bringing them to your living rooms and the dreamers creating what's next... From brilliant creatives to technology trailblazers across the globe, WBD offers career-defining opportunities, thoughtfully curated benefits, and the tools to explore and grow into your best self. Here you are supported, here you are celebrated, here you can thrive.

Your New Role:
This role will work with a fast-paced team to create the data technology stack used to deliver end-user insights for developing the next generation of digital products and product strategies. You will also help build out an internal data pipeline and develop ways to optimize data technologies to shape our digital data strategy.

Your Role Accountabilities:
- Gain an understanding of brand and enterprise data warehouse/reporting requirements that will inform the data architecture and modeling approach. Employ these learnings to construct appropriate ETL processes, database queries, and data/permission models that support data aggregation and reporting on actionable insights.
- Passion for writing code that is efficient, organized, simple and scalable, meeting business requirements.
- Enthusiasm to learn and find opportunities to enhance and adapt daily engineering activities is highly desired.
- Debug, troubleshoot, design, and implement solutions to complex technical issues.
- Deliver end-to-end JIRA user stories meeting the quality expectations of the team.
- Familiarity with BI tools and the ability to create semantic layer models for business users.
- Participate in QA testing for data pipeline projects as well as implementation changes to the suite of analytical tools.
- Monitor batch data loads to meet SLAs; you should be able to quickly respond to and resolve production issues.
- Ability to thrive in a team-based environment, and flexibility to work a second shift.

Qualifications & Experience:
- Bachelor's degree in computer science, information systems, or information technology.
- 5-8 years of experience in data engineering.
- Knowledge of supporting data sets for Home Entertainment, Games DVD/Digital business, Content Sales and Licensing.
- Knowledge of SAP supply chain, APO, order management, trade spends, promotions, POS (Point of Sale), royalty, forecasting and cash collections.
- Experience with programming languages: SQL, Python, AWS Glue.
- Strong experience with MPP databases (Teradata and Snowflake).
- Experience with Snowpipe, tasks, streams, clustering, Time Travel, cache, and data sharing.
- Experience in conceptual/logical/physical data modelling and expertise in relational and dimensional data modelling.
- Experience with AWS cloud services: Kinesis, Lambda, IAM (Identity and Access Management) policies.
- Experience in SQL query tuning and cost optimization.
- Experienced in software delivery through continuous integration (for example Git, Bitbucket, Jenkins, etc.).
- Experienced in one or more automation and scheduling tools (for example Redwood, Airflow, etc.).
- Must be comfortable working in a Linux/Unix environment.
- Familiarity with AWS developer tools services like CodeDeploy and Data Pipeline.
- Experience with public/private API integration, web scraping, and data streaming architecture.
- Knowledge of Business Content Interchange (BCI).

Not required but preferred experience:
- Public speaking and presentation skills.
- Experience with dbt.

How We Get Things Done...
This last bit is probably the most important! Here at WBD, our guiding principles are the core values by which we operate and are central to how we get things done.
You can find them at www.wbd.com/guiding-principles/ along with some insights from the team on what they mean and how they show up in their day-to-day. We hope they resonate with you, and we look forward to discussing them during your interview.

The Legal Bits...
Warner Bros. Discovery embraces the opportunity to build a workforce that reflects the diversity of our society and the world around us. Being an equal opportunity employer means that we take seriously our responsibility to consider qualified candidates on the basis of merit, regardless of sex, gender identity, ethnicity, age, sexual orientation, religion or belief, marital status, pregnancy, parenthood, disability or any other category protected by law. If you're a qualified candidate and you require adjustments or accommodations to search for a job opening or apply for a position, please contact us at [email protected].

Date posted: 03/27/2024 12:02 PM
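The dimensional-modelling experience this posting asks for centers on star schemas: fact tables of measures joined to descriptive dimension tables. A minimal, hypothetical rollup in SQLite (table and column names are invented for illustration, not taken from any WBD system):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# star schema: one fact table keyed to a date dimension
conn.executescript("""
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, iso_date TEXT, month TEXT);
CREATE TABLE fact_sales (date_key INTEGER REFERENCES dim_date, units INTEGER, revenue REAL);
INSERT INTO dim_date VALUES (20240301, '2024-03-01', '2024-03'),
                            (20240302, '2024-03-02', '2024-03');
INSERT INTO fact_sales VALUES (20240301, 3, 30.0),
                              (20240302, 2, 25.0);
""")

# typical BI rollup: join the fact table to its dimension and aggregate by month
monthly = conn.execute("""
    SELECT d.month, SUM(f.units), SUM(f.revenue)
    FROM fact_sales AS f JOIN dim_date AS d USING (date_key)
    GROUP BY d.month
""").fetchall()
```

Keeping descriptive attributes in the dimension table means the fact table stays narrow, which is what makes these aggregate joins cheap to tune at warehouse scale.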
Senior Big Data Engineer, GFT
RBC, Toronto, ON
Job Summary

Job Description

What is the opportunity?
Are you a talented, creative, and results-driven professional who thrives on delivering high-performing applications? Come join us! Global Functions Technology (GFT) is part of RBC's Technology and Operations division. GFT's impact is far-reaching as we collaborate with partners from across the company to deliver innovative and transformative IT solutions. Our clients represent Risk, Finance, HR, CAO, Audit, Legal, Compliance, Financial Crime, Capital Markets, Personal and Commercial Banking and Wealth Management. We also lead the development of digital tools and platforms to enhance collaboration.

You will be a Senior Software Developer in the Big Data area, responsible for developing applications for large-scale data processing and analysis. You will work with all stakeholders to design best-in-class technology solutions. We value a positive attitude, willingness to learn, open communication, teamwork, and a commitment to clean, secure and well-tested code.

What will you do?
- Design, develop, and implement software solutions that meet the organization's strategic goals.
- Provide technical influence by sharing deep knowledge and experience.
- Help increase adoption of emerging technology within your area of expertise.
- Liaise with business partners to deliver solutions based on clients' needs.

What do you need to succeed?
Must Have
- 6+ years of combined experience in programming, small- to large-scale applications, frontend and backend engineering, test-driven development, microservices and architecture design principles.
- Demonstrated strong team leadership and ability in written and oral communication, along with strong presentation skills.
- Ability to determine the information and communication needs of stakeholders and the project.
- Expertise in multiple programming languages / frameworks such as Hadoop, Apache Spark, Python, Scala, Hive.
- Version control (Git).
- DevOps tools like Jenkins, UCD, Checkmarx, Helios.
- Experience with database engines such as SQL Server, including writing queries and performance tuning.
- Ability to translate business requirements into technology implementation.

Nice to Have
- Bachelor's degree in software engineering or a relevant field.
- Working experience with REST APIs, JSON, Postman, or curl.

What's in it for you?
We thrive on the challenge to be our best, progressive thinking to keep growing, and working together to deliver trusted advice to help our clients thrive and communities prosper. We care about each other, reaching our potential, making a difference to our communities, and achieving success that is mutual.
- A comprehensive Total Rewards Program including bonuses and flexible benefits, competitive compensation, commissions, and stock where applicable
- Leaders who support your development through coaching and managing opportunities
- Ability to make a difference and lasting impact
- Work in a dynamic, collaborative, progressive, and high-performing team
- A world-class training program in financial services
- Flexible work/life balance options
- Opportunities to do challenging work, take on progressively greater accountabilities, and build close relationships with clients
- Access to a variety of job opportunities across businesses and geographies

#LI-Hybrid #LI-Post #LI-PK #TECHPJ

Job Skills: Active Learning, Agile Methodology, Application Integrations, Big Data, Big Data Analytics, Big Data Technologies, Business Systems Analysis, Cloudera Hadoop, Database Development, Detail-Oriented, Enterprise Application Delivery, Hive Query Language, Programming Languages, PySpark, Python (Programming Language), Relational Database Management System (RDBMS), Software Development Life Cycle (SDLC), Software Development Life Cycle (SDLC) Methodologies

Additional Job Details
Address: RBC CENTRE, 155 WELLINGTON ST W, TORONTO
City: Toronto
Country: Canada
Work hours/week: 37.5
Employment Type: Full time
Platform: Technology and Operations
Job Type: Regular
Pay Type: Salaried
Posted Date: 2024-02-06
Application Deadline: 2024-04-19

Inclusion and Equal Opportunity Employment
At RBC, we embrace diversity and inclusion for innovation and growth. We are committed to building inclusive teams and an equitable workplace for our employees to bring their true selves to work. We are taking actions to tackle issues of inequity and systemic bias to support our diverse talent, clients and communities. We also strive to provide an accessible candidate experience for our prospective employees with different abilities. Please let us know if you need any accommodations during the recruitment process.

Join our Talent Community
Stay in the know about great career opportunities at RBC. Sign up and get customized info on our latest jobs, career tips and recruitment events that matter to you. Expand your limits and create a new future together at RBC. Find out how we use our passion and drive to enhance the well-being of our clients and communities at jobs.rbc.com.
Big Data Engineer (Python, Kafka, Spark)
NetApp, Bangalore, Any, India
About NetApp
We're forward-thinking technology people with heart. We make our own rules, drive our own opportunities, and try to approach every challenge with fresh eyes. Of course, we can't do it alone. We know when to ask for help, collaborate with others, and partner with smart people. We embrace diversity and openness because it's in our DNA. We push limits and reward great ideas. What is your great idea?

"At NetApp, we fully embrace and advance a diverse, inclusive global workforce with a culture of belonging that leverages the backgrounds and perspectives of all employees, customers, partners, and communities to foster a higher performing organization." -George Kurian, CEO

Job Summary
As a Software Engineer in NetApp India's R&D division, you will be responsible for the design, development and validation of software for Big Data engineering across both cloud and on-premises environments. You will be part of a highly skilled technical team named NetApp Active IQ. The Active IQ DataHub platform processes over 10 trillion data points per month, feeding a multi-petabyte data lake. The platform is built using Kafka, a serverless platform running on Kubernetes, Spark, and various NoSQL databases. It enables the use of advanced AI and ML techniques to uncover opportunities to proactively protect and optimize NetApp storage, and then provides the insights and actions to make it happen. We call this "actionable intelligence."

You will work closely with a team of senior software developers and a technical director, and will be responsible for contributing to the design, development and testing of code. The software applications you build will be used by our internal product teams, partners, and customers. We are looking for a hands-on lead engineer who is familiar with Spark and Scala, Java and/or Python. Any cloud experience is a plus.
You should be passionate about learning, be creative and have the ability to work with and mentor junior engineers.Job RequirementsYour Responsibility • Design and build our Big Data Platform, and understand scale, performance and fault-tolerance • Interact with Active IQ engineering teams across geographies to leverage expertise and contribute to the tech community. • Identify the right tools to deliver product features by performing research, POCs and interacting with various open-source forums • Build and deploy products both on-premises and in the cloud • Work on technologies related to NoSQL, SQL and in-memory databases • Develop and implement best-in-class monitoring processes to enable data applications meet SLAs • Should be able to mentor junior engineers technically. • Conduct code reviews to ensure code quality, consistency and best practices adherence.Our Ideal Candidate • You have a deep interest and passion for technology • You love to code. An ideal candidate has a github repo that demonstrates coding proficiency • You have strong problem solving, and excellent communication skills • You are self-driven and motivated with the desire to work in a fast-paced, results-driven agile environment with varied responsibilitiesEducation• 5+ years of Big Data hands-on development experience • Demonstrate up-to-date expertise in Data Engineering, complex data pipeline development. • Design, develop, implement and tune distributed data processing pipelines that process large volumes of data; focusing on scalability, low -latency, and fault-tolerance in every system built • Awareness of Data Governance (Data Quality, Metadata Management, Security, etc.) • Experience with one or more of Python/Java/Scala • Proven, working expertise with Big Data Technologies Hadoop, HDFS, Hive, Spark Scala/Spark, and SQL • Knowledge and experience with Kafka, Storm, Druid, Cassandra or Presto is an added advantageDid you know... 
Statistics show that women apply to jobs only when they feel 100% qualified. But no one is 100% qualified. We encourage you to shift the trend and apply anyway! We look forward to hearing from you.

Why NetApp?
In a world full of generalists, NetApp is a specialist. No one knows how to elevate the world's biggest clouds like NetApp. We are data-driven and empowered to innovate. Trust, integrity, and teamwork all combine to make a difference for our customers, partners, and communities.

We expect a healthy work-life balance. Our volunteer time off program is best in class, offering employees 40 hours of paid time off per year to volunteer with their favorite organizations. We provide comprehensive medical, dental, wellness, and vision plans for you and your family. We offer educational assistance, legal services, access to discounts, and financial savings programs to help you plan for your future.

If you run toward knowledge and problem-solving, join us.

Date posted: 03/21/2024 03:04 PM
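The windowed, keyed aggregation work this role describes (event streams from Kafka feeding Spark pipelines) can be sketched in plain Python. This is an illustration only, not NetApp's actual Active IQ code; the class, keys, and window size below are all hypothetical:

```python
from collections import defaultdict

# Illustrative sketch only: a tumbling-window, per-key event counter of the
# kind a Kafka -> Spark telemetry pipeline might implement at scale.
# All names here are hypothetical, not part of any real Active IQ API.
class TumblingWindowCounter:
    def __init__(self, window_seconds: int):
        self.window_seconds = window_seconds
        # Maps (window_start, key) -> event count.
        self.counts: dict = defaultdict(int)

    def ingest(self, timestamp: float, key: str) -> None:
        # Assign each event to the tumbling window containing its timestamp.
        window_start = int(timestamp) - int(timestamp) % self.window_seconds
        self.counts[(window_start, key)] += 1

    def window_counts(self, window_start: int) -> dict:
        # Counts for one window, keyed by event key.
        return {k: c for (w, k), c in self.counts.items() if w == window_start}

# Example: three telemetry events from two (hypothetical) storage systems,
# all falling into the 60-second window starting at t=60.
agg = TumblingWindowCounter(window_seconds=60)
agg.ingest(timestamp=100.0, key="system-a")
agg.ingest(timestamp=110.0, key="system-a")
agg.ingest(timestamp=115.0, key="system-b")
print(agg.window_counts(60))  # {'system-a': 2, 'system-b': 1}
```

In a real Spark deployment, this logic would be expressed as a windowed `groupBy` over a streaming DataFrame, with checkpointing providing the fault tolerance the role calls for.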