My experience with AWS Certified Solutions Architect – Professional (SAP-C01) exam – June 2019

Check out my new updated website at https://ebinissac.me

On June 12, 2019, I passed my fourth AWS certification – AWS Certified Solutions Architect – Professional. I would like to share my experience with it.

The last time I used AWS professionally was in December 2017. Ever since I joined Accenture, I have been working on a private cloud and never really had a chance to work on AWS. However, I always had the urge to get another cert to stay up to date, and also to get into an AWS project when the chance came. It was only in January 2019 that I finally made the decision to start preparing for the exam. Here is the timeline of my studies.

1) Accenture provided me access to an acloud.guru subscription. I watched every single video once, without really taking notes. I did the quiz presented after every lesson, and also did some of the labs.

2) Once the videos were completed, I read all the whitepapers mentioned during the lessons. It was a really difficult task. They were super long and boring af. I fell asleep a number of times reading them. However, they really gave a lot of insight into the best practices.

3) A Cloud Guru also links a number of re:Invent videos in their lessons. I watched every one of them. They were an hour long each, but I found them very interesting, and they also gave a lot of insight.

4) I did the exam simulator in A Cloud Guru, and failed miserably. I was expecting it anyway, but wanted to give it a shot.

5) I bought the practice exams from Whizlabs, did every set of questions once, and failed all of them. I was expecting that too. However, doing the practice exams actually trains you to think the AWS exam way. There were also so many mistakes in the Whizlabs exams, and some of their explanations did not make any sense to me, so I read the documentation myself and sent them links proving they were wrong. This was good for me because it introduced me to some new topics that I would not have looked into otherwise.

6) I watched most of the videos from A Cloud Guru again, did their practice test, and passed comfortably.

7) I attempted Whizlabs again; I barely passed some and failed others by a short margin. Again, it was clear that I was not prepared yet.

8) I bought practice exams from TutorialsDojo and attempted them. I passed all of them on the first attempt, but I still read through every question and its explanation.

9) At this point I started feeling confident, so I attempted the official AWS practice exam and passed comfortably with 85%. So I gave myself another two weeks to wrap up and attempt the final exam.

10) During the last two weeks, I subscribed to a Linux Academy trial, watched some lessons on topics I did not feel comfortable with, and did some labs.

11) I attempted the Whizlabs exams and scored over 85%, attempted the TutorialsDojo exams and scored over 90%, and attempted the official AWS practice exam again, also scoring 85%.

12) However, the final exam was on a totally different level than most of the practice exams. It was so difficult that I was almost sure I was about to fail. But in the end I passed with a score of 826 – I guess a lot of my guesses were right.

In terms of my resources:

  • A Cloud Guru – good videos, but not everything is covered. The exam simulator is good. It would have been great if they provided lab access as well.
  • Linux Academy – compared to A Cloud Guru, I felt the videos are better here. However, I think the content is not up to date for the new exam. Their real value lies in their hands-on labs, which were great.
  • AWS Whitepapers – boring, but if you can survive them, they are the best.
  • AWS re:Invent – really good if you enjoy watching this kind of content. I found them exciting to watch.
  • Whizlabs – very bad grammar and a lot of mistakes in the questions. Their support does not even read what you have asked; they just reply with some random answer. However, if you can look past these, their exams are good. Around 50% of the questions give very good explanations and introduce newer concepts. Some questions are garbage.
  • TutorialsDojo – Jon Bonso is responsive and replies to queries. There are some good questions, and every question has a very detailed explanation, which is really good. However, the exams are very easy compared to the actual exam.

In the end, I would say that all of these helped. I spent around six months doing all this, sacrificing a lot of my weekends and social life. However, I really feel that it was worth it.

What’s next? Probably not AWS.

How to use sudo in PowerShell

Well, this is not exactly the same as sudo.

However, it is very annoying when I want to run something in PowerShell and it gives me access denied, and then I need to go to the Start menu or taskbar, right-click, select Run as administrator, and so on, just to run one command. As someone who prefers commands over the GUI, I had to find a way to improve this.

The following command will open a PowerShell window with admin privileges:

Start-Process powershell -Verb RunAs

However, I find this to be a long command as well, so I wrapped it in a function called sudo and put it inside my PowerShell profile.

So my PowerShell profile looked like this:

function sudo {
    # Open a new PowerShell window with admin privileges (triggers a UAC prompt)
    Start-Process powershell -Verb RunAs
}

Now I can just run sudo from my normal PowerShell window and it will open an elevated prompt. Much faster, much more efficient.

sudo

Although this works, ultimately it is not what I want. I want to be able to do the following without installing third-party tools:

1) run within the same window (without opening another window)

2) be able to run certain commands elevated without opening a whole new PowerShell window – just like sudo; a partial sketch follows below
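
A possible partial step toward goal 2 is letting sudo accept a command to run in the elevated window. This is only a sketch under that assumption – it still opens a new window, and quoting of complex arguments would need more care:

function sudo {
    # Sketch: if a command was given, run it in an elevated window and keep the window open;
    # otherwise just open an elevated prompt as before
    param([Parameter(ValueFromRemainingArguments = $true)] $Command)
    if ($Command) {
        Start-Process powershell -Verb RunAs -ArgumentList '-NoExit', '-Command', ($Command -join ' ')
    } else {
        Start-Process powershell -Verb RunAs
    }
}

For example, sudo net stop spooler would open an elevated window running that command.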

That still does not tick either box fully, though. I will keep working on this, and if I get it done, I will update here.

How to fix AWS SES domain verification failures in GoDaddy domains

When we try to register a domain with AWS SES, it asks us to add some TXT records to our DNS records. The sample TXT record that AWS asks us to add is as follows:

Name                     Type   Value
_amazonses.example.com   TXT    pmBGN/7MjnfhTKUZ06Enqq1PeGUaOkw8lGhcfwefcHU=

However, if we add this in our GoDaddy console, AWS is unable to verify the domain and shows a failure notification. The fix is simple: just remove the domain name from the Name field. That is, update the record as below:

Name         Type   Value
_amazonses   TXT    pmBGN/7MjnfhTKUZ06Enqq1PeGUaOkw8lGhcfwefcHU=

The same applies when updating the DKIM records.
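
To confirm the record is visible the way AWS expects, we can query it once DNS has propagated (a quick check from PowerShell; the name below is from the sample record above):

# Should return a TXT record whose value is the verification token
Resolve-DnsName -Name _amazonses.example.com -Type TXT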

How to change the OpenSSH server port in Windows

Installing the OpenSSH server on Windows is quite straightforward, as all we need to do is follow the instructions here.

However, it took me a while to figure out how to change the listening port. In particular, the installation folder contains an sshd_config_default file, and changing it makes no difference.

After digging through the documentation, I finally found that on Windows, sshd reads its configuration data from %programdata%\ssh\sshd_config.

So that is where the config file is located.


Changing the port is easy: all we need to do is edit the line that specifies the port and restart the sshd service.

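From an elevated PowerShell prompt, the whole change looks roughly like this (2222 is just an example port, and the firewall rule name is my own choice):

# Open the active config and change "Port 22" to e.g. "Port 2222"
notepad "$env:ProgramData\ssh\sshd_config"

# Restart the service so the new port takes effect
Restart-Service sshd

# The new port may also need to be allowed through Windows Firewall
New-NetFirewallRule -Name 'sshd-2222' -DisplayName 'OpenSSH Server (port 2222)' -Protocol TCP -LocalPort 2222 -Direction Inbound -Action Allow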

TIL how to disable the timeout of mapped drives in Windows

I was trying to copy some big files over the network to another server, and I mapped the destination drives on the local server for easy copying. However, I kept getting errors in my script, which I suspected were caused by the drives getting disconnected. Here is how to set them to never disconnect.

Run the following in a command prompt as administrator, where -1 means disable:

net config server /autodisconnect:-1
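
To double-check the setting afterwards, net config server with no arguments prints the current server configuration, including the idle session time:

net config server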

TIL how to check the bandwidth between 2 servers

We subscribed to a dedicated line between two datacenters, and when we tried to copy some files over, it was really slow. We were supposed to get a transfer rate of a few MB/s, but were getting only 20 KB/s, which was unacceptable. We needed to make a clear case to the service provider to get their support in fixing this. Simple Google searches led me to a tool called iperf, and it gave me exactly what I wanted. It is a shame that I never knew this tool existed.

On the destination server, I ran the iperf server with the command below:

.\iperf3.exe -s -p 136 (only a few ports were open between the two sites, and 136 was not otherwise in use)

and on the source, I ran the iperf client:

.\iperf3.exe -c <Destination IP> -p 136

The results were enough to convince the service provider to fix their network.

This is probably the most basic test that can be done with the tool, but there are plenty of other options, as documented here.
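
A couple of other options I found handy in the iperf3 documentation are parallel streams and reversing the test direction (same assumed port as above):

# Four parallel streams, to better saturate the link
.\iperf3.exe -c <Destination IP> -p 136 -P 4

# Reverse mode: the server sends and the client receives
.\iperf3.exe -c <Destination IP> -p 136 -R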

How I recovered an unbootable Linux server hosted in OVH/SoYouStart

My friend has a server hosted at SoYouStart; suddenly the server went down and nobody knew why. He asked for my help to bring it back online.

We tried booting into recovery mode from the console, and we were able to log in using the credentials they sent. We were also able to download all the files as a backup. However, once we booted normally, the server was still not reachable.

Suspecting a bootloader issue, I tried reinstalling GRUB, and it worked. This is how I fixed it after logging in to rescue mode.

$ fdisk -l (to find the names of the physical drives, something like “/dev/sdXY”, where X is the drive and Y is the root partition. Ours was a RAID setup, so /boot was on /dev/md1 and / was on /dev/md2)

$ mount /dev/md2 /mnt (mount the root partition)
$ mount --bind /dev /mnt/dev
$ mount --bind /proc /mnt/proc
$ mount --bind /sys /mnt/sys

$ chroot /mnt (this changes the root of executables to the drive that won’t boot)
$ mount /dev/md1 /boot (mount the boot partition; if /boot is not a separate partition, skip this step)
$ grub2-mkconfig -o /boot/grub2/grub.cfg
$ grub2-install /dev/sda (/dev/sda and /dev/sdb were the physical disks used in the RAID setup, not partitions; if it is not a RAID, you should use the disk where /boot is installed)
$ grub2-install /dev/sdb

Ctrl+D (to exit out of chroot)

$ umount /mnt/dev
$ umount /mnt/proc
$ umount /mnt/sys
$ umount /mnt/boot
$ umount /mnt

Reboot!

We finally managed to bring up the server, which had been down for two weeks. I felt so proud of it.

 

How to fix the error “an authentication error has occurred (code 0x80004005)” when connecting through RDP


Ever since the policy to disable TLS 1.0 was pushed down to the local machines, we started getting the error “an authentication error has occurred (code 0x80004005)” when accessing a few of our Windows 2008 R2 servers. It was interesting because we have a bunch of other servers that we could access with no problems. This seems to be a very generic error code, as Google results showed multiple problems and multiple solutions for it.

Apparently, in my case, the patch to add RDS support for TLS 1.1 and TLS 1.2 was not installed on the 3 servers with this problem. So I had to download the patch from this Microsoft website, install it, and reboot them remotely. Once installed and rebooted, voila!!
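
To check whether a server already has a particular update installed, Get-HotFix works remotely too. A small sketch – KB3080079 is, to my knowledge, the update that adds TLS 1.1/1.2 support for RDS on 2008 R2, but verify the KB number against the Microsoft page; server01 is a hypothetical name:

# Returns the hotfix entry if installed, errors out if not found
Get-HotFix -Id KB3080079 -ComputerName server01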

Edit: There are multiple causes for this problem, and there are multiple ways to fix it. I am just sharing what worked for me, and it may not work for everyone. I am saying this because this post is now on the first page of Google results and I am getting a lot of traffic to it, and I do not want to disappoint you. Thanks.

Fix “Failed to open Group Policy Object. You might not have the appropriate rights” error

I was facing this error on one of my servers while trying to open gpedit, with the additional message “The volume for a file has been externally altered so that the opened file is no longer valid”.

Here is how I fixed it.

1) Enable viewing hidden files in Explorer.

2) Navigate to C:\Windows\System32\GroupPolicy\Machine

3) Rename the file Registry.pol to something else.

4) Run gpupdate (the command-line equivalent of these steps is sketched below).
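
If you prefer the command line, the equivalent of steps 3 and 4 from an elevated prompt is roughly this (the backup name is arbitrary):

ren C:\Windows\System32\GroupPolicy\Machine\Registry.pol Registry.pol.bak
gpupdate /force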

After this, I was able to open gpedit normally.

Note that by doing this, whatever policies were in the local group policy settings will be gone; only settings from the domain policy will remain. So if you have made any local policy changes, they will need to be re-done.

Fix problems in trusting files from a DFS namespace

So we migrated our fileshare to a DFS namespace, and we started facing a lot of problems. One of the most annoying was that, no matter how we trusted the source, PowerShell scripts from the DFS namespace would not run without giving a warning.

Interestingly, the problem was that the FQDN of the DFS namespace is considered an internet location by Windows, causing it not to trust the location. This can be fixed by editing the local group policy:

Group Policy Editor > Computer Configuration > Administrative Templates > Windows Components > Internet Explorer > Internet Control Panel > Security Page > Site to Zone Assignment List

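In that policy, enable it and add an entry whose value name is the DFS FQDN and whose value is 1, which maps it to the Local intranet zone. To my understanding, the policy stores these mappings under the ZoneMapKey registry key, so the equivalent from PowerShell looks roughly like this (dfs.example.com is a hypothetical name):

# Map the DFS namespace FQDN to zone 1 (Local intranet)
$key = 'HKLM:\SOFTWARE\Policies\Microsoft\Windows\CurrentVersion\Internet Settings\ZoneMapKey'
New-Item -Path $key -Force | Out-Null
New-ItemProperty -Path $key -Name 'file://dfs.example.com' -Value '1' -PropertyType String -Force | Out-Null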