
Sebastian Rama

  • Mercedes-Benz Financial Services France
  • Cloud Architect

Remote

Summary

System administrator with strong technical experience in databases, virtualization technologies and high availability. Broad expertise in operating systems such as UNIX/Linux. Used to working under SLAs in high-density server environments with large numbers of concurrent users.
Skills:
Cloud Services, PaaS and IaaS (Azure, AWS, GCP)
Support tiers L3, L4
Scripting (PowerShell, Python, Bash)
Databases (SQL and NoSQL) - MySQL, HBase, Redshift, Cassandra, MongoDB
Configuration management - Ansible, Chef, Puppet, WSUS
Automation and CI/CD - Terraform, Pulumi, Azure DevOps
OS flavors - RHEL 6.3 and up, Windows Server
ETL - Apache NiFi, Azure Data Factory, SSIS
Analytics - QlikView, Tableau, Alteryx, Spotfire, Power BI
Hadoop - Cloudera, Hortonworks, Hive, Spark, Kafka, HDFS, Oozie, HBase, Ranger, Knox

Specialties: Linux RHEL - Windows Server - Networking - Firewalls - LAN - WAN - TCP/IP -
VLANs - CI/CD - SELinux - Apache - OpenLDAP - Active Directory - SQL - Virtualization - Web services - KVM.

Experience

  • Mercedes-Benz Financial Services France - Cloud Architect

    IT | Remote 2023 - present
    MB.CE: Mercedes-Benz Cloud Experience
    With MB.CE we provide harmonized, non-domain-specific Cloud Services and processes via a managed marketplace for Mercedes-Benz.
    MB.CE – Vision
    One harmonized offering for non-business-differentiating problems
    Drive business by focusing on custom business-differentiating solutions
    Effective use of precious resources
    Unified, frictionless experience of a platform
    Open ecosystem mindset
    Focus on API first, automation & self-service
    Foster a mindset of openness to integrate products of 3rd parties
    Ensure the core assets (APIs, Events, Data) are available & consistently published
    Build a managed ecosystem that allows 3rd parties to enrich the platform's core products
    Integrating services like Datadog (a brief sketch follows this entry)
    Building POC for SRE
    Skills: Cloud Computing · Amazon Web Services (AWS) · Microsoft Azure
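    A minimal sketch of the kind of Datadog integration mentioned above, assuming the datadogpy client and placeholder API keys; the metric name, tags and keys are hypothetical illustrations, not the actual MB.CE integration.

        # Sketch: push a custom platform metric to Datadog (hypothetical names and keys).
        import time
        from datadog import initialize, api

        initialize(api_key="<DD_API_KEY>", app_key="<DD_APP_KEY>")  # placeholder credentials

        # Submit a custom gauge, e.g. number of marketplace services onboarded.
        api.Metric.send(
            metric="mbce.marketplace.services.onboarded",  # hypothetical metric name
            points=[(int(time.time()), 42)],
            tags=["env:poc", "platform:mbce"],
        )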
  • Cloudera France SARL - Solutions Architect

    Technical | Remote 2022 - 2023
    Motivate and enable our customers on their Enterprise Data Cloud journey
    Participate in the pre- and post-sales process, helping both the sales and product teams to interpret customer use cases
    Drive POCs with customers to successful completion
    Design and implement Enterprise Data Cloud architectures and configurations for customers
    Work directly with customer technical resources to devise, recommend and implement solutions based on the understood requirements
    Recommend best practice design patterns for distributed data pipelines and analytical computing architectures
    Plan and deliver presentations and workshops to customer/internal stakeholders
    Mentor junior consultants
    Assist in the technical hiring process
    Write and produce technical documentation, blogs and knowledgebase articles
    Work directly with customer’s technical resources to devise and recommend solutions based on the understood requirements
    Analyze complex distributed production deployments, and make recommendations to optimize performance
    Able to document and present complex architectures for the customers' technical teams
    Work closely with Cloudera’s teams at all levels to help ensure the success of project consulting engagements with the customer
    Help design and implement Hadoop architectures and configurations for customers
    Drive projects with customers to successful completion
    Write and produce technical documentation, knowledge base articles
    Participate in the pre- and post-sales process, helping both the sales and product teams to interpret customers’ requirements.
    Keep current with the Hadoop Big Data ecosystem technologies.
    Attend speaking engagements when needed.
  • Telenet - Big Data Platform Engineer

    IT | Remote 2020 - 2022
    Migration of the current Cloudera CDH platform to CDP on AWS.
    - Deploy 5 new clusters for the upcoming demands.
    - Support the Big Data platform (Spark, Hive, etc.)
    - Linux, Kerberos, scripting with Bash & Python
    - DevOps approach with Git and Ansible to manage clusters

    Existing and upcoming data technologies: Hadoop, Scala, Spark, Kafka, SQL, Flink, Cassandra, ELK
    Designing and building cloud-based data solutions, more specifically with AWS (IAM, S3, Glue, EMR, EKS, SageMaker, …); a brief boto3 sketch appears at the end of this entry.
    Data integration/movement platform specialist with experience in performing the following activities:

    NiFi: establishing an enterprise-level framework - automating intake procedures, promotion and smaller components (GitLab)
    Establishing monitoring and alerting solutions on cluster level with Open Source solutions.
    CFM (Cloudera NiFi) Implementation.
    Setting up E2E NiFi environments on the cloud
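    A minimal monitoring sketch in the spirit of the open-source alerting mentioned above, assuming a NiFi instance at a hypothetical URL and the standard /nifi-api/flow/status endpoint; Kerberos/TLS client authentication is omitted and the threshold is illustrative.

        # Sketch: poll NiFi flow status and flag excessive queued FlowFiles.
        import requests

        NIFI_URL = "https://nifi.example.internal:8443"  # hypothetical host

        resp = requests.get(f"{NIFI_URL}/nifi-api/flow/status", verify=False)
        resp.raise_for_status()
        status = resp.json()["controllerStatus"]

        queued = int(status["flowFilesQueued"])
        if queued > 10000:  # illustrative alert threshold
            print(f"ALERT: {queued} FlowFiles queued cluster-wide")
        else:
            print(f"OK: {queued} FlowFiles queued, {status['activeThreadCount']} active threads")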

    Deployment of Cloudera Private Cloud Base
    Establishing and automating an enterprise-level framework (CDP 7.x)
    Migration of CDH workloads. Cluster management, upgrades, expansions, etc.
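    A minimal sketch of scripting cluster management against the Cloudera Manager REST API, assuming a hypothetical CM host, API version v41 and basic-auth credentials; it only lists managed clusters and is not the site-specific automation used here.

        # Sketch: list clusters known to Cloudera Manager (hypothetical host and credentials).
        import requests

        CM_API = "https://cm.example.internal:7183/api/v41"  # hypothetical host and API version

        resp = requests.get(f"{CM_API}/clusters", auth=("admin", "<password>"), verify=False)
        resp.raise_for_status()

        for cluster in resp.json()["items"]:
            print(cluster["name"], cluster.get("fullVersion"))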

    Kafka:
    E2E setting up Kafka environments (Brokers, Connect, SchemaRegistry, Kerberos, MirrorMaker)
    Setting up monitoring (Lenses, Prometheus, Grafana, ELK …)
    Setting up security (SASL/GSSAPI, SSL)
    Kubernetes deployments of Kafka components
    Setting up E2E Kafka environments (Brokers, Connect, Schema Registry, Kerberos, mTLS, MirrorMaker) on the cloud (AWS)
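    A minimal client-side sketch matching the security setup described above (SASL/GSSAPI over TLS), assuming the confluent-kafka Python client and hypothetical broker, keytab and topic names; it illustrates the client configuration, not the actual deployment.

        # Sketch: produce to a Kerberos/TLS-secured Kafka cluster (hypothetical names and paths).
        from confluent_kafka import Producer

        conf = {
            "bootstrap.servers": "broker1.example.internal:9093",        # hypothetical broker
            "security.protocol": "SASL_SSL",
            "sasl.mechanism": "GSSAPI",
            "sasl.kerberos.service.name": "kafka",
            "sasl.kerberos.keytab": "/etc/security/keytabs/app.keytab",  # hypothetical keytab
            "sasl.kerberos.principal": "app@EXAMPLE.INTERNAL",           # hypothetical principal
            "ssl.ca.location": "/etc/pki/tls/certs/ca-bundle.crt",
        }

        producer = Producer(conf)
        producer.produce("platform.healthcheck", key="ping", value="ok")  # hypothetical topic
        producer.flush()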

    Technology stack:
    - Kafka stack, NiFi, Kerberos, OpenShift, Hadoop stack, Kudu, Impala, Python, Ranger, Knox, GitLab,
    Prometheus + Grafana, Terraform, AWS and GCP.
    Skills: Big Data Analytics
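    A brief boto3 sketch of the AWS data-platform wiring referenced earlier in this entry, assuming credentials already configured in the environment; the bucket and Glue job names are hypothetical.

        # Sketch: list raw-zone objects and start a Glue ETL job (hypothetical names).
        import boto3

        s3 = boto3.client("s3")
        glue = boto3.client("glue")

        # List objects in a hypothetical data-lake raw bucket.
        for obj in s3.list_objects_v2(Bucket="example-datalake-raw").get("Contents", []):
            print(obj["Key"])

        # Kick off a hypothetical Glue ETL job.
        run = glue.start_job_run(JobName="raw-to-curated")
        print("Started Glue job run:", run["JobRunId"])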

Education

No education listed

Network

No professional contacts
