To ensure the best candidate experience, we recommend applying for a maximum of 3 roles within 12 months to avoid duplicating efforts.
Job Category
Software Engineering
Job Details
About Salesforce
We're Salesforce, the Customer Company, inspiring the future of business with AI + Data + CRM. Our core values lead the way, helping companies across every industry blaze new trails and connect with customers in a whole new way. We also empower you to be a Trailblazer, driving your performance and career growth, charting new paths, and improving the state of the world. If you believe in business as the greatest platform for change and in companies doing well and doing good, you've come to the right place.
Org Overview
The Data and Analytics Organization drives growth and margin at Salesforce through data insights. From robust governance to strategic execution, we support data pioneers with an unbiased, enterprise-wide approach. Our enterprise data strategy lays a solid data foundation and fosters a culture of data-driven decisions, while a cohesive data supply chain ensures quality from start to finish. By deploying and integrating platform tools, we enable seamless data access and automated data management, driving efficiency and growth with actionable insights. As a trusted partner, we build a data ecosystem that fuels innovation; our commitment to integrity and accessibility drives informed decision-making and takes Salesforce to new heights.
Team Overview
The Data Strategy and Management Engineering team brings data to life by partnering with data producers and platform engineers to empower data consumers (data scientists, data analysts, and visualization engineers) who rely on data for business analytics and AI-augmented solutions. We do this by offering trusted data in an agile manner and making it accessible for a broad range of use cases. We take pride in our data curiosity and work on architecting, automating, and scaling our data curation frameworks, services, and processes to rapidly integrate disconnected and disparate raw data into business-relevant assets, all in the name of customer success.
Responsibilities:
1. Design efficient and scalable data pipelines for collecting, transforming, and loading data from various sources.
2. Implement error handling and monitoring mechanisms to ensure data quality and pipeline reliability.
3. Partner with data producers to understand data sources, enable data contracts, and define the data model driving analytical use cases.
4. Optimize data storage solutions and implement strategies for query performance, cost, and scalability.
5. Monitor and enhance the performance, availability, and scalability of data pipelines, addressing bottlenecks and latency issues.
6. Ensure data security and compliance with relevant regulations (e.g., GDPR) by implementing data masking, access control, and other data protection measures.
7. Serve as a subject matter expert for the solution and connect the work to business impact.
8. Collaborate with cross-functional teams, provide technical guidance, and mentor junior engineers.
9. Evaluate open-source technologies and platforms, and run proofs of concept on new technologies and tools to select the most suitable solutions as needed.
Requirements:
1. B.S./M.S. in Computer Science or equivalent experience in big data engineering, data acquisition, and integration projects.
2. 5+ years of experience designing, implementing, and maintaining relational/data warehousing environments (preferably involving large data environments).
3. Strong background in Data Warehousing concepts and schema design.
4. Proficiency in programming languages used in data engineering, such as Python and SQL, and in big data technologies like Hadoop, Spark, Kafka, and other distributed computing frameworks.
5. In-depth understanding of data modeling and lakehouse/data mesh architectures, and proficiency in building frameworks and data pipelines.
6. Strong problem-solving skills and the ability to troubleshoot complex data-related issues with a prime focus on data quality and management.
7. Excellent communication skills to collaborate with technical and non-technical stakeholders.
8. A beginner's mindset and a commitment to continuous improvement.
Preferred:
1. Knowledge of Salesforce products and working with Salesforce metadata.
2. Experience with public cloud and data platforms such as GCP, AWS, and Snowflake.
3. Familiarity with production debugging techniques.
Accommodations
If you require assistance due to a disability when applying for open positions, please submit a request via this Accommodations Request Form.
Posting Statement
At Salesforce, we believe that the business of business is to improve the state of our world. Each of us has a responsibility to drive Equality in our communities and workplaces. We are committed to creating a workforce that reflects society through inclusive programs and initiatives such as equal pay, employee resource groups, inclusive benefits, and more. Learn more about Equality at www.equality.com and explore our company benefits at www.salesforcebenefits.com.
Salesforce is an Equal Employment Opportunity and Affirmative Action Employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender perception or identity, national origin, age, marital status, protected veteran status, or disability status. Salesforce does not accept unsolicited headhunter and agency resumes. Salesforce will not pay any third-party agency or company without a signed agreement with Salesforce.
Salesforce welcomes everyone.