Job description
About TechBiz Global
TechBiz Global is a leading recruitment and software development company. Our diverse, globally distributed team provides IT recruitment, outstaffing, outsourcing, software development, and consulting services, with a primary focus on helping our partners achieve their business goals.
Headquartered in Germany, we have successful clients all over the world, and we understand your unique needs. Our team has hands-on experience with the challenges that come with rapid growth in the IT sector. That’s why all of our offerings are built with a tech mindset.
Description:
As a Technical Solutions Architect, you will collaborate with customers to design scalable data architectures utilizing Databricks technology and services. Leveraging your technical expertise and business acumen, you will navigate complex technology discussions, showcasing the value of the Databricks platform throughout the sales process. Working alongside Account Executives, you will engage with customers' technical leaders, including architects, engineers, and operations teams, aiming to become a trusted advisor who delivers concrete outcomes. You will liaise with teams across Databricks and executive leadership to advocate for your customers' needs and foster valuable engagements.
The impact you will have:
Develop Account Strategies: Work with Sales and other essential partners to develop strategies for your assigned accounts to grow their usage of the Databricks platform.
Establish Architecture Standards: Establish the Databricks Lakehouse architecture as the standard data architecture for customers through excellent technical account planning.
Demonstrate Value: Build and present reference architectures and demo applications to help prospects understand how Databricks can be used to achieve their goals and land new use cases.
Capture Technical Wins: Consult on big data architectures, data engineering pipelines, and data science/machine learning projects to prove out Databricks technology for strategic customer projects. Validate integrations with cloud services and other third-party applications.
Promote Open-Source Projects: Become an expert in and promote Databricks-inspired open-source projects (Spark, Delta Lake, MLflow) across developer communities through meetups, conferences, and webinars.
Job requirements
What we look for:
Minimum qualifications:
Educational Background: Degree in a quantitative discipline (Computer Science, Applied Mathematics, Operations Research).
Experience: 12+ years total, with a minimum of 5 years in a customer-facing pre-sales, technical architecture, or consulting role and expertise in at least one of the following technologies:
A minimum of 2 years working on the Databricks platform is a must
Big data engineering (e.g., Spark, Hadoop, Kafka)
Data Warehousing & ETL (e.g., SQL, OLTP/OLAP/DSS)
Data Science and Machine Learning (e.g., pandas, scikit-learn, HPO)
Data Applications (e.g., Log Analysis, Threat Detection, Real-time Systems Monitoring, Risk Analysis)
Technical Expertise:
Experience translating a customer's business needs into technology solutions, including establishing buy-in with key customer stakeholders at all levels of the business.
Experience designing, architecting, and presenting data systems to customers, and managing the delivery of those data architectures into production.
SQL Proficiency: Fluent in SQL and database technology.
Development Languages: Development and debugging experience in at least one of the following languages: Python, Scala, Java, or R.
Cloud Experience: Experience building solutions with public cloud providers; Azure is required, and AWS or GCP is good to have.
Preferred qualifications:
Certifications (Highly Preferred):
Databricks Professional Certifications:
Data Engineer (Professional), ML Engineer (Professional)
Technical certifications such as Azure Solutions Architect Expert, AWS Certified Data Analytics, DASCA Big Data Engineering and Analytics, AWS Certified Cloud Practitioner, AWS Certified Solutions Architect, and Google Cloud Professional certifications.
Specific Expertise (Highly Preferred): Technical expertise in at least one of the following areas:
Generative AI and AI/ML
Unity Catalog
Databricks Data Warehousing