Archive for April, 2017

Passed AWS SA Pro!

April 27, 2017

[Screenshot: exam result, 27 April 2017]

After just over a month (one week of it full-time) of serious studying and stressing like crazy, I've passed my AWS SA Pro exam. Having received the exam reminder in October 2016, I did what most developers would do and left it to the last. The last day, to be exact. Adding ample pressure to get it right the first time. I mean, who wants to write the Associate exam again?!

Google was my friend. First, have a read of these bloggers, all more qualified than myself:

  1. http://jayendrapatil.com/
  2. https://medium.com/@Miha.Kralj/passing-the-aws-certified-solutions-architect-professional-exam-ebbdc26d6598
  3. https://gabrielrojas.nyc/2017/04/01/tips-on-how-to-get-all-5-aws-certifications-in-1-month/

Have a google and you'll find plenty more. I won't repeat everything they've said. Just a few notes for those looking to get certified soon; maybe it helps, maybe not.

Also, if you have the money, acloudguru.com ($99) and definitely linuxacademy.com ($29/m) are worth a look. If I had to choose one, it would be linuxacademy.com (free 7-day trial). 🙂

My two cents is on time management. Before I started the exam I just made a column on my test paper to keep time with:

Question | Time remaining
      10 | 2:35
      20 | 2:20
      30 | 2:05
      40 | 1:50
      50 | 1:35
      60 | 1:20
      70 | 1:05
      80 | 0:50

You really have no time to waste! Rather move on than lose even an extra minute on a single question. I'm not the quickest reader, but I couldn't believe I didn't get to make a second pass on that exam. That rarely happens to me with an online MCQ exam.
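For the curious, those checkpoints are just arithmetic. A minimal sketch in Python, assuming the 170-minute clock the exam had at the time and the 90-second-per-question pace the table implies (both inferred, so adjust for your exam):

    # Regenerate the pacing checkpoints above.
    # Assumptions: a 170-minute exam clock and a 90-second-per-question
    # pace -- both inferred from the table, not official numbers.
    TOTAL_SECONDS = 170 * 60
    SECONDS_PER_QUESTION = 90

    for question in range(10, 81, 10):
        remaining = TOTAL_SECONDS - question * SECONDS_PER_QUESTION
        print(f"{question:>8} | {remaining // 3600}:{remaining % 3600 // 60:02d}")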

Also, a month part-time is not enough. I was eventually forced into taking a week of leave to study full-time, as I had a feeling the content was too much. It is not difficult; the amount of material is just pretty hectic.

I hope that helps someone else, with enough time, to make better choices than I did and keep some of their hair intact. 🙂


DMS instead of Data Pipeline

April 27, 2017

In a previous post I detailed my trials and failures using Data Pipeline to archive Aurora tables to Redshift. This led to a comment from rjhintz about using AWS DMS instead. I initially went with Data Pipeline because we would eventually truncate the source table and would not want that truncate replicated to Redshift, deleting the archive data. But I would still take some time out to check out the DMS service.
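A side note on that concern: if I'm reading the DMS docs right, the task settings JSON has a ChangeProcessingDdlHandlingPolicy section that controls whether source-side DDL like TRUNCATE gets replayed on the target. A sketch of the fragment, which you should verify against the current docs before trusting your archive to it:

    import json

    # Sketch: tell the DMS task NOT to replay source-side DDL
    # (drop/truncate/alter) against the target tables. Assumption:
    # ChangeProcessingDdlHandlingPolicy behaves as documented.
    task_settings = {
        "ChangeProcessingDdlHandlingPolicy": {
            "HandleSourceTableDropped": False,
            "HandleSourceTableTruncated": False,  # keep the archive intact
            "HandleSourceTableAltered": False,
        }
    }
    print(json.dumps(task_settings))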

AWS DMS's initial use case is to help people migrate to AWS from on-premises DB installations. As it says in the name 😉 My use case, though, is archiving. We initially used Data Pipeline to achieve this, and the setup was pretty tedious! To say the least. Even once it is up and running, we still have to check that the jobs have completed correctly and that nothing has gone wrong.

This weekly checking had become a chore. This is where DMS comes in. We only have one table that needs truncating, whereas all its related tables simply need to be kept in sync. It took a day to get up to speed with DMS; after that, we migrated all our Data Pipelines except one to DMS.

Create a replication instance that has access to your source and target databases. An instance was needed with Data Pipeline as well; it's just much, much easier in DMS. No AMIs with larger instance stores needed.

[Screenshot: DMS replication instance settings]
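If you'd rather script this step than click through the console, here's a minimal boto3 sketch. The identifier, instance class, subnet group, and security group below are all made-up placeholders:

    import boto3

    dms = boto3.client("dms")

    # Sketch: a replication instance that can reach both databases.
    # All identifiers below are illustrative placeholders.
    response = dms.create_replication_instance(
        ReplicationInstanceIdentifier="archive-replication",
        ReplicationInstanceClass="dms.t2.medium",
        AllocatedStorage=50,
        ReplicationSubnetGroupIdentifier="my-dms-subnet-group",
        VpcSecurityGroupIds=["sg-0123456789abcdef0"],
        MultiAZ=False,
        PubliclyAccessible=False,
    )
    print(response["ReplicationInstance"]["ReplicationInstanceArn"])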

Create your database endpoints.
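Same deal in boto3: one source endpoint (Aurora MySQL) and one target (Redshift). Hostnames, ports, and credentials below are placeholders:

    import boto3

    dms = boto3.client("dms")

    # Sketch: source endpoint (Aurora MySQL) and target endpoint (Redshift).
    # Server names and credentials are placeholders.
    source = dms.create_endpoint(
        EndpointIdentifier="aurora-source",
        EndpointType="source",
        EngineName="aurora",
        ServerName="my-cluster.cluster-abc.us-east-1.rds.amazonaws.com",
        Port=3306,
        Username="dms_user",
        Password="example-password",
    )

    target = dms.create_endpoint(
        EndpointIdentifier="redshift-target",
        EndpointType="target",
        EngineName="redshift",
        ServerName="my-warehouse.abc123.us-east-1.redshift.amazonaws.com",
        Port=5439,
        DatabaseName="archive",
        Username="dms_user",
        Password="example-password",
    )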

Create a task that will migrate your desired tables and then keep them in sync.

[Screenshot: DMS task creation settings]
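And the boto3 version of the task, again with placeholder ARNs, schema, and table names. MigrationType="full-load-and-cdc" is what gives you "migrate, then keep in sync":

    import json

    import boto3

    dms = boto3.client("dms")

    # Sketch: full load of the selected tables, then ongoing replication
    # (CDC). ARNs, schema, and table names are placeholders.
    table_mappings = {
        "rules": [{
            "rule-type": "selection",
            "rule-id": "1",
            "rule-name": "include-archive-tables",
            "object-locator": {"schema-name": "mydb", "table-name": "%"},
            "rule-action": "include",
        }]
    }

    task = dms.create_replication_task(
        ReplicationTaskIdentifier="aurora-to-redshift-archive",
        SourceEndpointArn="arn:aws:dms:us-east-1:123456789012:endpoint:SOURCE",
        TargetEndpointArn="arn:aws:dms:us-east-1:123456789012:endpoint:TARGET",
        ReplicationInstanceArn="arn:aws:dms:us-east-1:123456789012:rep:INSTANCE",
        MigrationType="full-load-and-cdc",
        TableMappings=json.dumps(table_mappings),
    )
    dms.start_replication_task(
        ReplicationTaskArn=task["ReplicationTask"]["ReplicationTaskArn"],
        StartReplicationTaskType="start-replication",
    )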

That’s it. Done.