Muhammad

Senior Sales Engineer
Male · Sales Representative · Lives in Indonesia

Work experience

  • Senior Sales Engineer

    Snowflake
    2022.07-Present (3 years 4 months)
    Technology Evangelism & Pre-Sales Support
      • Present Snowflake’s technology and vision to executives, technical teams, and business stakeholders.
      • Conduct in-depth demonstrations, Proofs of Concept (PoCs), and hands-on workshops to showcase Snowflake’s capabilities.
      • Provide strategic guidance to customers on adopting Snowflake for data warehousing, data lakes, data engineering, and AI/ML use cases.
    Solution Architecture & Implementation
      • Design and develop Snowflake reference architectures, including:
        • Data ingestion strategies (batch and real-time) using ETL tools, Kafka, Snowpipe, Snowpipe Streaming, and Apache NiFi.
        • Optimized data modeling for performance, scalability, and cost efficiency.
        • Advanced data transformation leveraging Streams, Tasks, Stored Procedures, and Snowpark.
        • Seamless integration with third-party tools such as Tableau, Dataiku, Power BI, AWS QuickSight, and AWS SageMaker.
        • Data governance best practices, including access control, RBAC, masking, and tagging.
        • Cost governance and optimization strategies for efficient Snowflake usage.
    Customer Engagement & Advisory
      • Work directly with customers to define technical requirements and recommend best practices for migrating workloads to Snowflake (on-prem to cloud, multi-cloud strategies).
      • Guide clients on performance tuning, workload optimization, and cost-efficient resource allocation.
      • Collaborate with Product, Engineering, and Marketing teams to continuously refine Snowflake’s solutions and improve customer experience.
    Mentorship & Knowledge Sharing
      • Mentor for the Snowmaker Program, guiding new Snowflake professionals in developing expertise in Snowflake architecture, best practices, and real-world implementation.
      • Conduct internal training sessions and enablement programs to support Snowflake partners and new hires.
      • Deliver webinars, workshops, and technical sessions to educate customers on Snowflake’s latest innovations.
  • Senior Solution Engineer

    Cloudera
    2019.11-2022.06 (2 years 8 months)
    • Engage with sales prospects; prepare and present Cloudera’s technology and address any technical questions that come up during the sales cycle.
    • Work directly with prospective customers’ technical resources to devise and recommend solutions based on the understood requirements, for example recommending solutions for customers who want to move their workloads from an on-premises cluster to the cloud (AWS or Azure).
    • Work with customers, partners, and prospects to understand and propose solutions using Cloudera technologies (Cloudera on-premises, Cloudera on Private Cloud, and Cloudera on Cloud).
    • Support the technical needs of existing customers, including discovering additional use cases for Cloudera’s technology.
    • Work closely with the product management and engineering teams to align customer requirements with future versions and products.
    • Maintain a deep understanding of Cloudera’s technology and a willingness to learn new technologies when needed.
    • Manage the day-to-day technical sales process for numerous customers.
    • Continuously learn and update skills in quickly evolving technologies.
    • Represent Cloudera Professional Services on site with clients by demonstrating subject matter expertise in big data and data modernization (presenting or demonstrating Cloudera Data Warehouse, Cloudera Machine Learning, Cloudera Operational Database, and Cloudera DataFlow).
    • Analyze complex distributed production deployments, make recommendations, and implement Cloudera solutions to optimize performance.
    • Help develop reference Hadoop architectures and configurations.
  • Head of Data Engineering

    Tiket.com (PT. Global Tiket Network)
    2019.08-2019.11(4 months)
    • Led the three teams under the Data Engineering division: Data Engineering, Machine Learning Engineering, and Data Platform.
    • Drove development of data movement from data sources into our Google Cloud Platform environment (Cloud Storage, BigQuery) using Apache NiFi.
    • Managed the data engineering, data platform, and machine learning engineering teams to build, scale, and deploy all data solutions.
    • Partnered closely with the Product, Engineering, Operations, Marketing, Finance, and Business teams.
    • Helped team members and managers develop their careers by assigning them to projects tailored to their skill levels, long-term skill development, personalities, and work styles.
    • Worked closely with dedicated recruiting staff to expand the team, including sourcing and interviewing candidates.
  • Lead Data Engineer

    OVO (PT Visionet Internasional)
    2017.07-2019.08 (2 years 2 months)
    • Led the big data engineering team, monitoring and supervising all of the team’s work.
    • Managed resource allocation.
    • Designed and implemented ETL processes with Informatica BDM to load data from various sources (JSON, APIs, NoSQL, RDBMS, CSV) into a Hadoop cluster and Kinetica (an in-memory database).
    • Provided solution architecture related to the data ingestion process.
    • Performed root cause analysis on all processes and resolved all production issues.
    • Involved in complex event processing projects, providing solution architecture for all related activities.
    • Implemented complex event processing using Spark Streaming, Kinetica (an in-memory database), Kafka, Hive, and Informatica Big Data Streaming.
  • Technical Consultant, Professional Services

    Teradata
    2016.05-2017.07 (1 year 3 months)
    Maybank Indonesia Data Warehouse Project
      • Analyzed source systems and worked with business analysts to identify, study, and understand requirements and translate them into technical specification and design documents.
      • Developed stored procedures, functions, views, and complex queries using Teradata.
      • Involved in unit testing, integration testing, and user acceptance testing.
    XL DWHR Project (as Hadoop Consultant)
      1. Created a new Capacity Scheduler configuration.
      2. Wrote a review based on the performance tests we ran (Ab Initio workloads running on top of the Hadoop cluster).
    Directorate General of Taxes Project (as Big Data Engineer)
      Worked to solve several use cases:
      a. Build communities of taxpayers based on their relationships.
      b. Find taxpayers who report “Lost” and “Overpaid” on their annual tax returns at least three consecutive times.
      c. Find the activity patterns of suspected taxpayers.
      d. Build decision-tree-based graphs, for example to find how many subsidiaries a large company has.
    Telkomsel Race Project (as Big Data Engineer)
      1. Installed Aster on top of a 66-node Hadoop system (Cloudera 5.7).
      2. Loaded data from Hadoop into Aster using QueryGrid.
      3. Worked to solve several use cases:
         a. Top e-commerce sites that Telkomsel subscribers visit.
         b. Telkomsel user mobility.
         c. Communities or groups based on Telkomsel subscribers’ internet activity.
  • IT Consultant

    IBM
    2015.05-2016.05 (1 year 1 month)
    • Analyzed source systems and worked with business analysts to identify, study, and understand requirements and translate them into technical specification and design documents.
    • Implemented ETL processes with DataStage to load data from various sources into the data warehouse.
    • Developed stored procedures, functions, views, and complex queries using Oracle PL/SQL.
    • Involved in unit testing, integration testing, and user acceptance testing.
    • Created action filters, parameters, and calculated sets for preparing dashboards and worksheets in Tableau.
    • Developed Tableau visualizations and dashboards using Tableau Desktop.
  • Service Architecture

    PT. XL Axiata Tbk
    2015.03-2015.05(3 months)
    • Participated in process flow analysis and process redesign.
    • Produced detailed functional design documents to match customer requirements.
    • Produced technical specifications for custom development and systems integration requirements.
  • Engineer - Enterprise Data Warehouse

    PT. XL Axiata Tbk
    2012.03-2015.04 (3 years 2 months)
    • Designed and implemented ETL processes with Ab Initio to load data from various sources into the data warehouse.
    • Developed stored procedures, functions, views, and complex queries using Oracle PL/SQL and Teradata.
    • Performed analysis on the quality and sources of data to determine the accuracy of the information being reported.

Educational experience

  • Telkom Institute of Technology

    Information Technology
    2007.01-2011.01(4 years)

Certificates

  • Hands On Essentials - Data Sharing
  • Hands On Essentials - Data Warehouse
  • SnowPro Core Certification
  • SnowPro Advanced: Data Engineer