Posts

Build JupyterNotebook(Anaconda) environment in AWS EC2

Key words: JupyterNotebook, Anaconda, AWS, EC2, Proxy, Winscp, Putty, DeepLearning

<Background>
The company employs a proxy server, through which all external connections must pass. Company computers use Virtual Desktop Infrastructure (VDI), which restricts the installation of large software such as Anaconda.

<Task>
As a Deep Learning engineer, I need a Jupyter Notebook environment on AWS. However, managed services such as SageMaker are costly, so a self-built environment on AWS EC2 is required.

<AWS EC2 Configuration>
Create an EC2 instance with the following specifications:
AMI: Ubuntu / Amazon Linux 2 AMI
CPU: t3.large
VPC: Public subnet with an Internet Gateway (IGW)
Elastic IP: Required, as dynamic IP addresses are prone to firewall interception.
Security Group: Open inbound ports 8081 (Jupyter) and 22 (SSH); allow all outbound traffic.
Storage: Minimum of 16GB, as Anaconda alone requires around 9GB.
Key Pair: Generate an RSA key pair,
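
Once the instance is up, the server-side setup can be sketched as below. This is a minimal sketch, assuming an Amazon Linux 2 instance with the Anaconda installer already downloaded; the installer file name and install path are illustrative, not taken from the original post.

```shell
# Install Anaconda in batch mode (-b) to a chosen prefix (-p);
# the installer version here is illustrative.
bash Anaconda3-2023.09-0-Linux-x86_64.sh -b -p "$HOME/anaconda3"
export PATH="$HOME/anaconda3/bin:$PATH"

# Generate a default Jupyter config (optional, for setting a password etc.)
jupyter notebook --generate-config

# Listen on all interfaces on port 8081, matching the security-group
# inbound rule above; --no-browser because the server is headless.
jupyter notebook --ip=0.0.0.0 --port=8081 --no-browser
```

From the VDI side, the notebook is then reachable at http://<Elastic-IP>:8081 through the corporate proxy.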

AWS Notes - Network - TransferFamily(Transfer) connect with EFS

1. Connect AWS EC2 to EFS
Create the EFS file system and the EC2 instance.
EFS's DNS: fs-XXXXXXX.efs.ap-northeast-1.amazonaws.com
Add a security group: allow the NFS port (2049) in the EFS inbound rules (source: the EC2 instance's private IP).
Use the commands below to mount the EFS on the EC2 instance:
a. Install the EFS tools:
sudo yum install amazon-efs-utils
b. Mount the EFS:
sudo mount -t efs -o tls fs-XXXXX:/ ~/efs-mount

2. Create the Transfer Family server
Create the Transfer Family endpoint:
Protocols: SFTP
Identity Provider: Service Managed
Endpoint: s-XXXXXXXXX.server.transfer.ap-northeast-1.amazonaws.com
Endpoint Type: Public

3. Create a user to connect Transfer to EFS
Create a role that allows Transfer to access EFS.
a. Create the policy:
{ "Version": "2012-10-17", "Statement": [ { "Sid": "GrantTransferRoleAccess", "Effect": "Allow", "Action": [ "elasticfilesystem:ClientRootAccess", "elasticfilesystem:ClientWrite", "elasticfilesystem:ClientMount" ], "Resource
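
Once a Transfer Family user is created with that role and an SSH public key, the connection can be tested over SFTP. A minimal sketch, assuming a service-managed user named testuser and a local private key; both names are illustrative:

```shell
# Connect to the Transfer Family public endpoint over SFTP.
# Files written here land on the EFS file system mapped to the user.
sftp -i ~/.ssh/transfer_key testuser@s-XXXXXXXXX.server.transfer.ap-northeast-1.amazonaws.com
```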

AWS Notes - DevOps - OpsWorks

AWS OpsWorks = Managed Chef & Puppet

Features
Chef & Puppet help you automate server configuration and repetitive actions.
Works great with EC2 & on-premises VMs.
Best suited for existing Chef & Puppet users; not a good fit for migrating from other technologies.
Chef offers more detailed configuration than Beanstalk.

AWS Notes - Compute - Elastic Beanstalk

Features
PaaS service
Supports Blue/Green deployment (using Route 53)
No additional charge for Elastic Beanstalk; you pay only for the underlying AWS resources the application consumes.

3 Architecture Models
Single Instance: for dev
LB + ASG: for production web apps
ASG only: for non-web apps in production

2 AWS IAM roles
Service role: assumed by Elastic Beanstalk to use other AWS services on your behalf
Instance profile: applied to the instances in your environment; allows them to retrieve information and perform tasks that vary depending on the environment type and platform

Reference
https://tutorialsdojo.com/aws-elastic-beanstalk/
Udemy: https://www.udemy.com/course/aws-solutions-architect-professional/
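
The Single Instance model above can be sketched with the Elastic Beanstalk CLI (eb). Application name, environment name, platform string, and region are illustrative assumptions, not from the original notes:

```shell
# Install the EB CLI
pip install awsebcli

# Initialize an application in the current project directory
eb init my-app --platform python-3.11 --region ap-northeast-1

# Single Instance model: no load balancer, suitable for dev
eb create my-app-env --single

# Deploy the current application version to the environment
eb deploy
```

For the LB + ASG production model, omit --single so eb create provisions a load-balanced, auto-scaled environment.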

AWS Notes - Migrations - On-premises strategy with AWS

Application Discovery Service
Gathers information about on-premises servers to plan a migration
Track with Migration Hub
Server utilization and dependency mappings

Application Migration Service
Replaces SMS (Server Migration Service)
Incremental replication of on-premises live servers to AWS
Migrates the entire VM into AWS

Elastic Disaster Recovery
Replaces CloudEndure Disaster Recovery
Recovers on-premises workloads onto AWS

Database Migration Service

Reference
Udemy: https://www.udemy.com/course/aws-solutions-architect-professional/

AWS Notes - Migrations - DR

RPO and RTO
RPO: Recovery Point Objective (to which point in time data is recovered, i.e. how much data loss is acceptable)
RTO: Recovery Time Objective (how long recovery takes, i.e. how much downtime is acceptable)

Disaster Recovery Strategies (from slowest to fastest recovery)
Backup and Restore (Snapshot, Storage Gateway, etc.)
Pilot Light (DB replication running, EC2 not running)
Warm Standby (scaled-down copy running)
Hot Site / Multi-Site Approach (fastest, full-scale copy running)

Reference
Udemy: https://www.udemy.com/course/aws-solutions-architect-professional/

AWS Notes - Migrations - DMS, SCT, CART

DMS (Database Migration Service)

Types
Homogeneous migrations (Oracle to Oracle)
Heterogeneous migrations (Microsoft SQL Server to Aurora)

Features
An EC2 replication instance must be created to perform the replication tasks
Works over VPC Peering, VPN, Direct Connect
The source database remains available during the migration
Supports: Full Load, Full Load + CDC (Change Data Capture), CDC only

SCT (Schema Conversion Tool)

Features
Used for heterogeneous database migrations
Can scan the application source for embedded SQL and convert it as part of a database schema conversion project

DMS + Snowball
Use SCT to extract the data locally and move it to an Edge device
Ship the device to AWS; the data is loaded into S3
Use DMS to migrate the data to the target data store

CART (Cloud Adoption Readiness Tool)
An assessment report produced by answering questions on Business, People, Process, Platform, Operations, and Security

Reference
https://en.wikipedia.org/wiki/Change_data_capture
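
The replication-instance-plus-task flow above can be sketched with the AWS CLI. All identifiers, ARNs, and the table-mappings file are placeholders to be filled in from your own environment:

```shell
# Create the replication instance that will run the migration tasks
aws dms create-replication-instance \
    --replication-instance-identifier my-dms-instance \
    --replication-instance-class dms.t3.medium

# Create a Full Load + CDC task; source/target endpoints and the
# replication instance ARN must already exist (placeholders here)
aws dms create-replication-task \
    --replication-task-identifier my-migration-task \
    --source-endpoint-arn arn:aws:dms:ap-northeast-1:ACCOUNT:endpoint:SOURCE \
    --target-endpoint-arn arn:aws:dms:ap-northeast-1:ACCOUNT:endpoint:TARGET \
    --replication-instance-arn arn:aws:dms:ap-northeast-1:ACCOUNT:rep:INSTANCE \
    --migration-type full-load-and-cdc \
    --table-mappings file://table-mappings.json
```

The --migration-type flag maps directly to the three supported modes: full-load, cdc, and full-load-and-cdc.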