Subversion Over SSH on an Unmanaged VPS

Linux

See my previous post, Upgrade Ubuntu and LEMP on an Unmanaged VPS, to learn how to upgrade LEMP and Ubuntu to the latest versions. In this post, we will install Subversion on the server and learn how to access it using Subversion over SSH (svn+ssh).

Subversion allows a client to execute svn commands on the server over SSH. As a result, there is no need to have a Subversion server process (svnserve) running or an Apache server configured to support Subversion (mod_dav_svn); one only needs SSH access. Subversion over SSH is simple and sufficient for my needs.
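
To be precise, no svnserve daemon needs to be listening; the svn client spawns one on demand over SSH in tunnel mode for the duration of each command. Roughly speaking (a simplified illustration; the svn client does this for you automatically), an svn+ssh:// URL boils down to:

# What an svn+ssh:// URL does under the hood: run svnserve in
# tunnel mode (-t) over an ordinary SSH connection.
ssh myuser@myserver.com svnserve -t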

For svn+ssh, access to Subversion is controlled by the Linux user login. To avoid having to input your SSH login password every time you run an svn command, I recommend configuring SSH with public key authentication between your client and the server. For instructions, see the “SSH With Public Key Authentication” section in my previous post, SSH and SSL With HostGator Shared Web Hosting.
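
If you have not set up public key authentication yet, the gist of it is sketched below (adjust the key type, user, host, and port to match your setup; port 3333 is the example custom port used later in this post):

# On the client, generate an SSH key pair (if you don't already have one).
ssh-keygen -t rsa

# Append the public key to the server's authorized_keys file.
# (ssh-copy-id does the same thing, if you have it installed.)
cat ~/.ssh/id_rsa.pub | ssh -p 3333 myuser@myserver.com 'mkdir -p ~/.ssh && cat >> ~/.ssh/authorized_keys'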

To begin, on the server, install the Subversion package and create a repository:

# Install subversion
sudo apt-get install subversion

# Check that subversion is installed
svn --version

# Make a repository directory
sudo mkdir /var/repos

# Create a repository
sudo svnadmin create /var/repos

We need to change the permissions on the newly-created repository directory so that our Linux user can have read-write access. I recommend adding your user to the ‘www-data’ group and giving that group modify access to the repository like so:

# Change myuser's primary group to www-data
sudo usermod -g www-data myuser

# Check by showing all groups that myuser belongs to
groups myuser

# Change repository group owner to be www-data
sudo chgrp -R www-data /var/repos

# Add group write permission to repository
sudo chmod -R g+w /var/repos

On the client machine, we will use the Subversion client with svn+ssh to access the repository. Because we are using a custom SSH port and the Subversion command line does not provide an option to specify a custom SSH port, we have to configure SSH to use the custom port automatically.

Configure SSH to use the custom port when connecting to your server by creating an SSH configuration file located at “~/.ssh/config” (on Mac OS X) or “%HOME%/.ssh/config” (on Windows). Input the following file content:

Host myserver.com
  Port 3333
  PreferredAuthentications publickey,password

After this, you can run “ssh myuser@myserver.com” instead of “ssh -p 3333 myuser@myserver.com” because SSH will use the custom 3333 port automatically when connecting to “myserver.com”.

Note: On Windows, I am using the DeltaCopy “ssh.exe” client in combination with the CollabNet “svn.exe” Subversion client. On Mac OS X, I am using the built-in ssh client and the svn client (installed using MacPorts).

To test access to the repository, run the following command on the client:

# List all projects in the repository.
svn list svn+ssh://myuser@myserver.com/var/repos

This command will return an empty line because there are no projects in the repository currently. If you do not see an error, then the command works correctly.

On the client, you can now issue the standard Subversion commands like the following:

# Import a project into the repository
svn import ./myproject svn+ssh://myuser@myserver.com/var/repos/myproject -m "Initial Import"

# The list command should now show your newly-imported project
svn list svn+ssh://myuser@myserver.com/var/repos

# Check out a local, working copy of the project from the repository
svn co svn+ssh://myuser@myserver.com/var/repos/myproject ./myproject2

# View the working copy's info (no need to input the svn+ssh URL once inside the project)
cd ./myproject2
svn info

# Update the project to the latest version
svn update

And we are done. Hopefully the above info will be useful should you ever need to get Subversion working.


Upgrade Ubuntu and LEMP on an Unmanaged VPS

Linux

See my previous post, Nginx HTTPS SSL and Password-Protecting Directory, to learn how to configure Nginx to enable HTTPS SSL access and password-protect a directory. In this post, I will explore how to upgrade LEMP and Ubuntu.

Upgrade LEMP

While one can upgrade each component of LEMP (Linux, Nginx, MySQL, PHP) separately, the safest way is to upgrade all software components installed on the system to ensure that the dependencies are handled properly.

Upgrade all software packages, including LEMP, by running the following commands:

# Update apt-get repositories to the latest with info
# on the newest versions of packages and their dependencies.
apt-get update

# Use apt-get dist-upgrade, rather than apt-get upgrade, to
# intelligently handle dependencies and remove obsolete packages.
apt-get dist-upgrade

Updating the PHP-FPM (FastCGI Process Manager for PHP) service may prompt you to overwrite the “/etc/php5/fpm/php.ini” and “/etc/php5/fpm/pool.d/www.conf” configuration files with the latest versions. I recommend selecting the option to show the differences, making a note of the differences (hitting the “q” key to quit out of the compare screen), and accepting the latest version of the files.

Update the two PHP-FPM configuration files with these changes to ensure that Nginx will successfully integrate with PHP-FPM:

# Fix security hole by forcing the PHP interpreter to only process the exact file path.
sudo nano /etc/php5/fpm/php.ini
   # Add the following or change the "cgi.fix_pathinfo=1" value to:
   cgi.fix_pathinfo=0

# Configure PHP to use a Unix socket for communication, which is faster than default TCP socket.
sudo nano /etc/php5/fpm/pool.d/www.conf
   # Keep the following or change the "listen = 127.0.0.1:9000" value to:
   listen = /var/run/php5-fpm.sock
   # The latest Nginx has modified security handling which requires
   # uncommenting the "listen.owner" and "listen.group" properties:
   listen.owner = www-data
   listen.group = www-data
   ;listen.mode = 0660

# Restart the PHP-FPM service to make the changes effective.
sudo service php5-fpm restart
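
On the Nginx side, the site configuration needs a PHP location block that points at the same Unix socket. The exact file and paths depend on how your Nginx sites were set up in the earlier posts, but as a rough sketch it looks something like this:

# Inside the server { } block of your Nginx site configuration
# (e.g. /etc/nginx/sites-available/default), pass PHP requests
# to PHP-FPM over the Unix socket configured above.
location ~ \.php$ {
    try_files $uri =404;
    fastcgi_pass unix:/var/run/php5-fpm.sock;
    fastcgi_index index.php;
    include fastcgi_params;
    fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
}

A mismatch here (for example, Nginx still pointing at 127.0.0.1:9000 while PHP-FPM listens on the socket) is a common cause of the “502 Bad Gateway” error mentioned below.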

Test by browsing to the “info.php” file (containing a call to the “phpinfo” function) to ensure that Nginx can call PHP-FPM successfully. Hopefully, you won’t see the “502 Bad Gateway” error, which would mean that Nginx could not talk to PHP-FPM. If you do see it, look at the Nginx and PHP-FPM error log files for hints on what could have gone wrong.

sudo tail /var/log/nginx/error.log
sudo tail /var/log/php5-fpm.log

Note: If you accidentally select the option to keep the current version of the PHP-FPM configuration files and now wish to get the latest versions, you will need to uninstall and re-install the PHP-FPM service:

sudo apt-get purge php5-fpm
sudo apt-get install php5-fpm

You will then need to update the two PHP-FPM configuration files per the instructions above.

Upgrade Ubuntu

The following is particular to my VPS provider, Digital Ocean, but perhaps it may help provide a general idea on what to expect with your own provider when doing an operating system upgrade.

On logging into my server, I saw the following notice:

New release '14.04.1 LTS' available.
Run 'do-release-upgrade' to upgrade to it.

Your current Hardware Enablement Stack (HWE) is no longer supported
since 2014-08-07.  Security updates for critical parts (kernel
and graphics stack) of your system are no longer available.

For more information, please see:
http://wiki.ubuntu.com/1204_HWE_EOL

To upgrade to a supported (or longer supported) configuration:

* Upgrade from Ubuntu 12.04 LTS to Ubuntu 14.04 LTS by running:
sudo do-release-upgrade

When I ran “sudo do-release-upgrade”, there was a dire warning about running the upgrade over SSH (which I ignored) and many prompts to overwrite configuration files with newer versions (which I accepted after taking note of the differences between the new and old versions). There was also a warning about how the upgrade could take hours to complete, though it ended up taking less than 15 minutes. The upgrade ended with a prompt to reboot, which I accepted.
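
One way to hedge against the risk that the SSH warning refers to (a dropped connection killing the upgrade midway) is to run the upgrade inside a terminal multiplexer such as screen, so the session survives a disconnect. A quick sketch:

# Start a named screen session and run the upgrade inside it.
screen -S upgrade
sudo do-release-upgrade

# If the SSH connection drops, log back in and reattach to the session.
screen -r upgrade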

After reboot, I updated the two PHP-FPM configuration files, “/etc/php5/fpm/php.ini”
and “/etc/php5/fpm/pool.d/www.conf”, per the instructions in the previous section.

In addition, I had to re-enable sudo permissions for my user by running the following:

# visudo opens /etc/sudoers using vi or nano editor, whichever is the configured text editor.
# It is equivalent to "sudo vi /etc/sudoers" or "sudo nano /etc/sudoers" but includes validation.
visudo
   # Add myuser to the "User privilege specification" section
   root       ALL=(ALL:ALL) ALL
   myuser     ALL=(ALL:ALL) ALL

I found the upgrade process, especially upgrading the Ubuntu operating system, to be a relatively painless experience. Hopefully you will find it to be the same when you do your upgrades.

See my followup post, Subversion Over SSH on an Unmanaged VPS, to learn how to install and use Subversion over SSH (svn+ssh).


Roadblock To Nirvana

Life and Self

Imagine that you have made a commitment to always remind yourself that reality is an illusion. That the floor you stand on, the air flowing into your lungs, your body, and the people and places around you are all illusions. That everything and everyone is not real. Whenever you remember to, you make this reminder to yourself. When washing dishes or stopped at a light in your car. Several times a day. For a year.

Brainwashing Myself

That’s what I did. I consciously brainwashed myself for a whole year. Good idea? I’m not so sure. I experienced many wonderful benefits but encountered one major downside.

Believing that reality is an illusion does have a basis in science. It does not need to be a fanciful flight of the imagination. At the basic quantum level, everything can be regarded as probabilities. In this instant of time, the probability collapses that a quark exists at this position in space. Other probabilities collapse into other quarks at the same position in space to create a neutron. And other probabilities collapse into protons, neutrons, and electrons to make an atom. And this atom, in combination with other atoms, forms a living cell and with other cells, forms your body. All these gazillion probabilities collapse into you, a thinking being, at this exact moment. And they collapse into you again in the next moment and the next. It’s a miracle that we don’t fly apart, scattered across the universe, blinking into and out of existence.

Expand that miracle to include each of the billions of humans, the Earth we inhabit, and the universe around us. It is awe inspiring. As I convinced myself more and more that all was illusion, I grew more amazed at reality. When I’m hiking, I have to stop and let the marvel of nature wash over me. What possibilities exist for there to be this majestic valley and mountain before me? Sometimes, I spend minutes looking at my hands, wondering at their existence and at the fact that I can move them with my thoughts. Amazement soon moves into gratitude for my existence and the endless wonders that surround me.

Note that I’m not in the state of amazement and gratitude continuously, all the time. This mindfulness comes and goes infrequently. Similar to how I have to continually remind myself to remember that reality is illusion, I have to remind myself to be amazed and grateful. I think this infrequency is a very good thing. Because humans adapt, I’m sure that after a while of being continually in such an “enlightened” state, it would start to feel dull and average.

Don’t Sweat the Small Stuff

The goal I was shooting for with this mind experiment was to attain better detachment from worldly concerns. And I believe the experiment succeeded. What does detachment sound like? The phrase “don’t sweat the small stuff… and it’s all small stuff” (from a book title) comes to mind. Small stuff like almost hitting a car that cuts suddenly into your lane on the freeway, or having to wait at the checkout line because someone ahead of you has an issue. Because all is illusion, why should I be attached to events and their outcomes? Everything is as it should be. The near collision wasn’t a collision. I am here, waiting in this line, because I am supposed to be here. And I’m doing what I am supposed to be doing. No decision can be wrong if everything is an illusion, including the decision itself.

Detachment is great. Most of the time, I’m not stressing out about what goes on, wondering if I’m making the right choice, or worrying about the future. I am just calm, relaxed. Now, I still do things that I have to do, such as paying the bills and being productive at work, to avoid the possibility of externally-induced stress, such as getting evicted or not having enough money for food. There is only so far you can go before reality takes a bite. I’m sure that a truly detached being wouldn’t care about what happens to his body, but I’m still concerned about not experiencing bad things like pain and starvation.

Other than the survival stuff, I expect things to work out. If they do, great. If they don’t, that’s okay. Surprisingly, most of the time, things work out for the best; if not immediately, then in the near future. Sometimes I think it’s for the worse, but then a twist occurs and it’s actually for the best. My previous car, a Jetta, developed a weird crayon smell in hot weather caused by the decay of the sound absorption material Volkswagen used. I was a bit vexed because the issue was a manufacturer defect but the car was out of warranty. I decided to live with it. Months later, my sister upgraded and offered to give me her Civic. I sold the Jetta and ended up with a car which was more reliable, used cheaper gasoline (regular, not premium), and was less expensive to maintain. Because I try not to expect good results, I’m pleasantly surprised when things just work out for the better. The universe (or if you prefer, God) knows what it’s doing.

Do Sweat The Big Stuff

The good thing about detachment is that I’m floating along in life, without stress. The bad thing about detachment is that I’m floating along in life, without stress. I’ve realized that self-induced stress is necessary to push me beyond my comfort level, to take action, and to force me to grow. Without stress, I feel like I’m at a dead-end. I’m no longer clawing my way up the corporate ladder. I’m not pushing myself to arrange get-togethers and activities with friends. I’m not under a time crunch to do things, professionally or personally. I get what needs to be done done, but I don’t push myself to go the extra mile. Sometimes I miss being busy, having to sweat about juggling family, friends, work, personal life, and their related dramas. I miss feeling like I’m accomplishing a lot.

“Creativity requires action, and part of that action must be physical. It is one of the pitfalls of Westerners adopting Eastern meditation techniques to bliss out and render ourselves high but dysfunctional. We lose our grounding and, with it, our capacity to act in the world. In the pursuit of higher consciousness, we render ourselves unconscious in a new way. Exercise combats this spiritually induced dysfunction.” – Julia Cameron, The Artist’s Way

When I know that my accomplishments have no meaning, it takes a lot of the wind out of feeling triumphant. There is a lack of motivation and a lot of passivity. I’m not driving. The universe directs my life… I’m just waiting for it to send experiences my way. The ride is very pleasant but I’m not going anywhere. I’ve gone too far to the other end and am falling down a puddle without end. As with anything in life, when you reach a roadblock or dead-end, it’s time to look for a detour.

“Don’t sweat the small stuff… and everything is small stuff” is right and wrong. While learning to not sweat the small stuff, I should have kept sweating the big stuff. The big stuff is not everything, it’s just the important thing. And it is right that I should be sweating the big stuff in my life. Sharing experiences with family, friends, and coworkers. Figuring out what is important and meaningful to do with my life. Having compassion for my fellow humans in their misfortunes and sharing the joy in their triumphs. Putting myself out into the world, making mistakes, getting bruised, meeting cool people, and learning and doing crazy, interesting things. Trying to be a better person than I was yesterday. And, though I hate to say it, forcing myself to eat healthy and exercise. Those are the big stuff that matters.

Good Vs Bad Procrastinators

I admit that is an awful lot of big stuff to sweat, especially for a procrastinator such as myself. As with anything in life, procrastination is good and bad, depending upon how you handle it. Being a “good” procrastinator, I push myself to be creative about eliminating unnecessary work and doing the remainder in a way that requires much less effort and time. As a software engineer, I would spend my time writing a program to do repetitive work, instead of doing the repetitive work myself, especially if I know it is recurring work. And I write how-to instructions in detail, in this blog and in a wiki at work, because it saves me a ton of time when I can quickly refer people asking for help to an article and when I need to remind myself how to do something. (You probably have experiences with coworkers who ask you a question that you take the time to answer and a few days later, they ask you the same question again. I just point them at the wiki repeatedly until they understand that I won’t humor them and they go off to find another “victim”.)

Sometimes, if I feel a task is not important, waiting until near the deadline to do it can save me time when I find out that the task is no longer necessary. The downside is that if I’m wrong, I may have to work extra hard to meet the deadline; but that is an acceptable risk and most of the time, I come out ahead. Good procrastinators are careful gamblers who figure out the odds that something needs to be done. Bad procrastinators are bad gamblers who bet their energy on things that don’t need to be done, ignoring the important work.

A Simple Plan For The Rest Of My Life

Back to the main topic, I believe I have found a simple 2 step plan to making progress on the big stuff. I created this plan by merging ideas from two books, “The Now Habit at Work” by Neil Fiore and “Why People Fail” by Simon Reynolds. This plan will work for procrastinators, and if it works for procrastinators, it should work for everyone else.

Here is the simple 2 step plan for sweating the big stuff:

  1. Commit to start creating for 15 minutes each day or whatever is most attainable. (For the truly bad procrastinators, 5 minutes might be a better starting point.)
  2. Increase the time as you feel comfortable to.

Why the weird “start creating” phrase? Let’s address the latter part first, as it is the most important. I believe that the purpose of life for all humans is to create. We express our most pure nature when we do something creative, like drawing a picture, writing a blog, inventing a new skateboard trick, building furniture out of discards, combining flavors into a new dish, and testing a new prank on a friend. When we create, we are pushing our limits to bring something new into existence. It reminds me of the phrase, “man is created in the image of God”. To me, that phrase means that when we are creating, we are closest to God.

When I look closer at my big stuff, I find that the biggest, most challenging stuff has to do with creation. Creating enriching moments with my family, friends, and coworkers. Trying to find my life’s purpose can more accurately be stated as creating my life’s purpose. Creating a better life for myself and my fellow beings. Creating a new, better version of myself each day. If I keep my focus on the big stuff and commit to creating each day, my creations will naturally become the big stuff. I sweat the big stuff by creating it.

Self Help For The Procrastinator

The “start creating” phrase and “15 minutes” time limit are for procrastinators or people who lead very busy lives. I synthesized this approach from two ideas about overcoming procrastination: getting started from “The Now Habit at Work” and limiting time commitment from “Why People Fail”. I hesitate to take on tasks because I don’t want to commit to spending my time and effort to complete them. The bigger the task and the greater the effort required, the more I hesitate. However, if I think that a task is short (just 15 minutes or even 5 minutes) and the effort is small (I’m committing to starting the task, not finishing it), I don’t mind taking it on. Once I get started, I usually go for longer than 15 minutes and sometimes, a couple of hours later, I will complete the creation. (This method is how I usually complete postings for this blog.)

It’s a mind trick I play on myself. The goal is to do something (anything) creative every day and the secondary goal is to establish a daily habit of creation. Once I have the habit, I won’t need the trick as much. Or if the habit never develops (usually true for a procrastinator), I may just have to live with the trick. I can still “start” the same thing today that I’ve started the past 20 days or that I’ve started a year ago. Off and on, I’ve “started” this blog post more than two dozen times already.

Before I end this posting, I want to talk about learning. Sometimes I have confused learning with creating. When I’m surfing the Internet and learning new things, could that be considered part of my commitment to doing something creative each day? The answer is yes and no. There is a gray area between creating and learning. I believe that learning may be required to meet the goal of creating, but it should never be the goal itself. For example, how can I invent a new skateboard trick if I don’t know how to skate and can’t perform any of the existing tricks? How can I create a video game if I don’t learn how to program first? So if you are learning in order to surpass that learning and create, then yes, it meets your commitment to start creating each day. If you are learning just to learn, then no, it does not meet the commitment to create.

Start Creating Every Day

If we are made in the image of God, then his greatest gift to us is creation in its many splendors. Are we not then made to create? By creating, do we not express our truest nature and offer thanks and gratitude to God?

I wanted to end with a story about Aya Kitou. She was a young woman, a teenager who was stricken with a disease, Spinocerebellar Ataxia, which took her mobility and ultimately her life. Yet, she never gave up on living, on growing, on creating her future, however imperfect and dark it became. From her diary, a quote (translated from Japanese) stands out: “So fall down, get up, and smile because you are alive and experiencing this wonderful gift of life.”


Judging vs Perceiving Dominant Types

Life and Self

In a previous post, MBTI: Not Misleading, Just Misunderstood, I mentioned judging and perceiving dominant functions. I was talking to a friend and believe that I have stumbled upon a good way to differentiate between the two. Consider judging and perceiving in terms of communication.

Judging dominant types are almost always judging what they hear and what they say, but their judgments are works in progress. Such types will give clues, verbal and nonverbal, as to the degree of certainty in what they are saying. They would say something like “In most cases, I think that this is true.” And the progression over time is that the judgments will grow from less certain to very sure. Eventually, they will say, “This is true.” Some things are not judged, some start with no previous judgments, and others have strong judgment immediately (the result of similar decisions in the past). As judgers examine and re-examine, they re-judge until they reach a point where they are strongly certain.

Conversely, perceiving dominant types do not immediately judge unless they have made similar decisions in the past. Without prior judgments, they wait until the very end to decide. Unfortunately, they (especially dominant perceivers with thinking secondary) usually do not give any clue (verbal or nonverbal) as to the level of certainty of their statements, mainly because they have not decided anything as of yet. They would say, “This is false”, and say it with no, little, some, or total certainty; a listener can’t be sure which. Perceivers wish to explore all the possibilities first before deciding on one. What they state may not be certain at all because no judgment may be attached.

The two different styles result in conflict when they communicate with each other. Judging dominants are continuously judging and communicating their level of certainty. Early on or midway through the discussion, they may say “I believe this is so but I may be wrong”. Perceiving dominants instead will state “This is so” without indicating any level of certainty.

Each type believes they are communicating with their own type. Unfortunately, this may not be the case. A judging dominant would believe that a perceiving dominant is very certain (because there is no verbal qualifier to the statement made) and attempt to figure out why the perceiver believes his statement to be true. A judger would then ask questions and come up with exception cases. The perceiver, who is “just throwing it out there”, is wondering why the judger is questioning him about what he said and attempting to close off possibilities prematurely. The perceiver first responds by trying to answer the judger’s questions, quickly gets annoyed when the questions persist, and starts throwing out other statements or possibilities in reply to the exception cases. The judger gets vexed because he views the perceiver as unwilling to explain why, making tangential statements that may conflict with each other, and changing judgments randomly. The end result is a communication breakdown and irritation with each other.

The above situation is made worse if either of the two types does not possess high emotional maturity and strong self-esteem. A perceiving dominant would feel under attack by the judging dominant. The perceiver would wonder why the judger is questioning him. A judging dominant would feel that the perceiver is not being serious, being disrespectful, and making fun of him. Negative emotions are mixed into the irritation cycle and can build up to eruptions of anger at each other.

Perhaps a better understanding can be arrived at when we consider how each type brainstorms. When brainstorming, judging dominants will make continual judgments that they refine using exception cases. Judgers will examine the feasibility of a possibility before moving on to the next one; judgers explore depth first. Perceiving dominants wish to explore all the possibilities before determining any possibility’s feasibility; perceivers explore breadth first. Seeking refinement, judgers will look for exceptions to what is suggested by a perceiver. The perceiver may feel that it is too premature to close off the possibility with questioning. Because statements made by perceivers are viewed as very certain, judgers may feel that it is too premature to make such judgments so soon. Brainstorming becomes an unpleasant experience for both types.

To make my point, I have painted the two types in their extremes. Both types make judgments. For judging dominants, the judgment is spread over the whole process. For perceiving dominants, the judgment is compressed to the end. Unfortunately, a non-judgmental statement said by a perceiving dominant sounds like a definitive statement to a judging dominant. A definitive statement said by a judging dominant sounds like a non-definitive statement to a perceiving dominant. These misunderstandings lead to a communication cycle which will frustrate both types.

The cycle can only be broken if at least one of the types does not assume that they are talking to their own type and makes the effort to determine the type they are speaking to. Judging dominant types should not ascribe certainty of judgment to statements made without qualifiers by perceiving dominant types. Absent any expression of degree of certainty, the judger should ask directly whether what was stated is a definitive judgment. Perceiving dominant types should ask if there is any certainty to another’s statement, instead of assuming none, and if possible, provide verbal indicators as to the certainty of their own statements.

When you find yourself getting irritated in a discussion, stop and ask yourself whether your conversation partner is of a different type. Adjust accordingly and with patience, you will communicate and feel better and so will the other person.


Real Insurance Is Better than AppleCare

Work and Money

Recently, my brother-in-law decided to buy a 13 inch Apple Macbook Pro with Retina. He asked me if the AppleCare Protection plan, which costs $249 extra, was worth it. I told him no. For the same amount of money or less, one could get a better protection plan than AppleCare.

The AppleCare Protection plan is a 3 year extended warranty plan. It only covers malfunctioning parts. It does not cover accidental damage, loss, or theft. If you drop the Macbook and the display cracks or the laptop stops working, you are out of luck because AppleCare does not cover that. You will need to pay the full repair price, which could be $1000 or more to replace a retina display. If you spill water on the keyboard and your Macbook shorts out, that’s too bad. If you lose the Macbook or someone steals it from the safety of your house, oh well, that’s the way the cookie crumbles. AppleCare does not cover any of that.

What does AppleCare cover? Well, if your keyboard or display malfunctions through no fault of yours, then Apple will repair or replace that component. The Genius Bar members at your local Apple Store will check the Macbook for damage, such as large dents, that could cause the malfunction. If they find such damage, they can refuse the repair; if you are very lucky, you will get someone nice enough to allow the free repair. Be aware that Apple has put moisture detectors inside the Macbook so that if you spill water on it, the Genius Bar will know and can refuse the repair even if you have AppleCare Protection.

There is a reason why extended warranties are pushed so often by retailers. They make a lot of money off of them. Usually if a product such as a laptop were to fail, it would most likely fail during the first year when the product is still under the standard one year warranty. So if the customer pays for an extra year or two of warranty, that is considered an almost guaranteed 100% profit for the retailer. Not to say that there isn’t any case where a day or two after the one year warranty expired, the product failed and the owner was glad to have paid for extended warranty. That case is the exception though. The odds suggest not buying an extended warranty.

When most people buy AppleCare, I think they believe that they are buying insurance. They aren’t. Insurance could cover repair or replacement due to accidental damage, loss, or theft. AppleCare is not insurance. For the same amount of money, they could get real insurance that provides greater peace of mind.

First, before we talk insurance, make sure that you buy that expensive Macbook with an American Express credit card. (If you don’t have an American Express card, get one. There are basic American Express cards with no annual fee that provide the two benefits below.) Purchasing an item with an American Express card provides the following two benefits for free:

  • Purchase Protection: your purchase is protected for up to 90 days from the date of purchase from accidental damage or theft. You will be reimbursed up to $1000. (If you have an American Express Platinum card or similar, you get protection from loss also and up to $10,000.)
  • Extended Warranty: doubles the warranty on your purchase up to an additional year. For an extra year after the original warranty expires, if the purchase malfunctions through no fault of yours, you will be reimbursed the original cost (up to $10,000).

Purchasing the Macbook with an American Express credit card will add one additional year of extended warranty on top of the standard Apple one year warranty for free. It is a no brainer to do so. American Express service is very friendly. For example, the Wifi feature on my sister’s iPhone 4s broke (due to Apple hardware defect) after she upgraded to iOS 7. Because it was past the one year warranty, Apple wanted $200 to replace the iPhone. I told my sister to call American Express to see if she was eligible for the free extended warranty. She was. They asked her to send them the receipt and then gave her a credit for the original cost of the iPhone. How cool is that?

Second, before buying insurance for your Macbook, check that you don’t already have it. You may have a rider on your house or rental insurance that covers your personal property such as electronics. Check to see what is covered. For example, my renter’s insurance covers theft and loss of personal property due to fire (burst pipes, etc.). Unfortunately, accidental damage is not covered. In the case of theft or loss, my rental insurance company will reimburse me for the depreciated cost or, if I purchase a replacement, they will cover the original cost. (This is nowhere as nice as American Express’ full credit of the original cost without requiring you to buy a replacement.)

If you don’t have a rider for personal property like electronics, you may want to ask your home insurance company about one. It may be the cheapest option because a rider is considered part of the bundle and you may get a better deal that way. Homeowners on forums provided some examples such as $40/year for $2000 coverage (much less than AppleCare) or $80/year for $5000 coverage for malfunctions, accidental damage, loss, and theft. The insurance cost seems to vary widely (one homeowner mentioned $15/year, another $30/year for $4000 coverage, and a third mentioned a deductible of $50).

Instead of a rider, you can purchase personal property insurance directly. Most likely, standalone insurance will be more expensive than a rider. The most recommended standalone insurance for a Macbook is an Inland Marine Insurance policy (can be gotten from several insurance companies like AllState, State Farm, and Farmers Insurance). Though originally created to cover expensive electronics on boats, the policy applies for land usage also. In a forum post (Best Insurance on Earth for your MacBook / Air etc. Far better than AppleCare), one person mentioned that it costs about $32/year for $1500 coverage of multiple devices with no deductible. (It was mentioned also that the Inland Marine Insurance policy did not cover phones.)

Another insurance mentioned was State Farm’s Personal Articles Policy, which costs $60/year for $3000 coverage. There are also insurance companies, like Safeware, that sell policies specific to high-end equipment and electronics. As with all other types of insurances, make sure to shop around to get the best deal.

When talking about insurance for electronics, Square Trade is a name that often comes up. Square Trade costs about the same as AppleCare; but in addition to the extended warranty, Square Trade covers accidental damage. Because Square Trade does not cover loss or theft, I believe that it is not the best deal. Square Trade is better than AppleCare but you can get better insurance than both for less.

Insurance is personal. I do not purchase extra insurance, such as extended warranties. (I do use an American Express card to take advantage of the free extra year of warranty though.) I am gambling that I’ll be careful enough not to break my Macbook, lose it, or be robbed. Considering all the money I have saved from not buying extra insurance or extended warranties on my many laptops, even if I have to pay full price to replace a Macbook, I will break even or come out ahead. However, if you feel more comfortable having some protection (nothing wrong with that), please consider the alternative options above to AppleCare. You will get much more bang for your buck.


Update to Latest Subversion Using MacPorts

Mac OS X

Because I make use of MacPorts to install my development tools on Mac OS X, installing or updating Subversion is simple, consisting of a single command line.

Install MacPorts

If you don’t already have MacPorts, go ahead and install it. MacPorts depends upon Xcode and the Xcode Command Line Tool. Instructions for installing both are provided on the MacPorts website.

Update MacPorts

Before installing or updating Subversion, you will want to update MacPorts by issuing this command:

sudo port -v selfupdate

Install Subversion

To install Subversion, issue a MacPorts command to install it like so:

sudo port install subversion subversion-javahlbindings

The Subversion JavaHL Bindings (“subversion-javahlbindings”) package is necessary to support integration with Eclipse, specifically using the Subclipse plugin. Thank heaven that MacPorts got around to supporting the Subversion JavaHL Bindings installation. Before, I had to manually find a compatible version of the JavaHL Bindings, download, and install it myself.

Note: When installing the Eclipse Subclipse plugin, you will need to select the specific Subclipse version that uses a Subversion version that is the same as your installed Subversion and JavaHL Bindings. The version numbers don’t match so you will need to look at the Subclipse documentation to determine which version of Subclipse to install. For example, Subclipse 1.10 uses the latest Subversion 1.8.

Update Subversion

You can update Subversion specifically or update all outdated MacPorts-installed packages by issuing these commands:

# Update only Subversion and JavaHL Bindings
sudo port -v upgrade subversion subversion-javahlbindings

# Update all outdated installed packages including Subversion
sudo port -v upgrade outdated

Install or Update Subclipse

To install or update the Eclipse Subclipse plugin, you will use the same installation instructions. Subclipse doesn’t have a separate update mechanism. To update Subclipse, you would basically install a newer version of it (without needing to remove the older version first).

Note: Eclipse has a menu item, Help->Check for Updates, which will update itself and supported plugins; unfortunately, Subclipse does not support this function.

To install or update Subclipse, follow these steps:

  1. Go to Eclipse menu: Help->Install New Software…
  2. Input “http://subclipse.tigris.org/update_1.10.x” into the “Work with” field and the table will be updated with installation packages available at that location. (Note: Subclipse 1.10 uses the latest Subversion 1.8.)
  3. Check just the Subclipse package and keep clicking Next until the end. Half-way through, you will be asked to accept the license agreement. Select the “I accept the terms of the license agreements” radio button and click Finish.
  4. You will get a security warning popup with the message, “Warning: You are installing software that contains unsigned content.” Click the OK button to proceed.
  5. Eclipse will need to restart. You will be prompted with a “Software Updates” popup asking “You will need to restart Eclipse for the changes to take effect. Would you like to restart now?” Answer Yes.

Use Older Subversion

MacPorts allows you to select an older version of its packages for use, instead of using the latest version. This is useful in case you do an update and realize that you can’t use the latest version of a particular package, perhaps due to software version incompatibility with one of your tools or applications. For example, because the latest version of Subclipse may not support the latest version of Subversion, you may need to force the use of the previous version of Subversion.

To see all the installed versions of Subversion, run this command:

sudo port installed | grep -i subversion

You should see something like the following output:

subversion @1.7.8_2
subversion @1.7.10_1
subversion @1.8.8_0 (active)
subversion-javahlbindings @1.7.8_2
subversion-javahlbindings @1.7.10_0
subversion-javahlbindings @1.8.8_0 (active)

To activate the previous version of Subversion, use these commands:

sudo port activate subversion @1.7.10_1
sudo port activate subversion-javahlbindings @1.7.10_0

If you are using the latest Subversion and want to uninstall all the older versions, run either of these commands:

# To uninstall a specific version of Subversion
sudo port uninstall subversion @1.7.10_1
sudo port uninstall subversion-javahlbindings @1.7.10_0

# To uninstall inactive versions for all packages including Subversion
sudo port uninstall inactive

Eclipse Keeps Asking For Subversion Password

I encountered a bug where Eclipse kept prompting me to input the Subversion password whenever I attempted to run a Subversion command such as update. Even though I checked the save password option, Eclipse would still prompt me each time. I did not encounter this issue using the command line Subversion, so I thought it was a Subclipse bug.

Turns out that this was an Eclipse bug, involving how Eclipse interacted with the Mac OS X Keychain where the subversion password was stored. I used the solution found at the bottom of this page, Subclipse 1.10.0 not saving passwords, to update the Eclipse code signature, which eliminated the password prompts.

Quit Eclipse and run this command:

codesign --force --sign - /Applications/eclipse/Eclipse.app

Run Eclipse and issue a Subversion command like update. If you get a Keychain access dialog, select “Always Allow”.

I’m very glad that MacPorts exist to make installations and updates so painless.


Send Me Money, Sucker!

Internet

Recently, I got an email from a family member which reads as follows:

From: XXXX@yahoo.com
Subject: Urgent!

Hi,

I'm out of town suffering a terrible incident, I need your urgent favor,
Please email me back as soon as possible.

Thanks.

XXXX
(XXX) XXX-XXXX

The displayed email address looks correct as XXXX@yahoo.com, but when I check the headers, the reply-to address is XXXX@outlook.com. And the phone number has the wrong area code.
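
For illustration only (these are made-up headers in the same spirit as the redacted addresses above), the mismatch in the raw message source looks something like this:

From: "XXXX" <XXXX@yahoo.com>
Reply-To: XXXX@outlook.com
Subject: Urgent!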

I recognized it as the money gram scam. Basically, if you reply to that email, you will receive a request to send money by Western Union (or a similar money transfer service), where it is easy for anyone to go and pick up the cash. (If you call the number, you will probably get voicemail.) The way this scam works is to hack into someone’s email account, send this same message to everyone in the address book, and hope that one or two people will fall for it and send money.

I sent a warning to my family and wasn’t surprised to find that most did not recognize this email as a scam. They were confused or thought it was a joke. The family member, whose email account was hacked, disclosed that several friends and acquaintances were calling to ask why he needed $930. This tells me that a lot of folks are not knowledgeable about Internet scams. I want to talk about scams, Internet and otherwise, and the one method that I use to fight them.

In the past, this and other frauds were perpetrated by isolated con artists. Nowadays, I believe that most of the scams on the Internet are perpetrated by criminal organizations. If I was a mafia boss, I would definitely have an Internet racket because face it, you can make a ton of money (from the hundreds of millions of victims) with very little risk of getting caught or punished, especially if you are located in another country.

So, there are groups of hundreds of criminals, backed by the best servers that dirty money can buy, running scams across the Internet (and elsewhere). They are working full-time to steal money from you and the companies you do business with. If they are truly International, they may be working full time across multiple time zones, while you are sleeping, eating, going to the bathroom, and watching TV.

You may throw up your hands in defeat at this point. And to be truthful, I agree. There is no way you can beat everything that an organization like that can throw at you. The best you can aspire to be is a potential victim that would take too much effort to defraud. Sad to say, your goal is to be less naive than the masses. Or more simply, the criminals will go for the lowest-hanging fruit and your job is to avoid being the lowest-hanging fruit.

The most powerful tool that we victims have in our arsenal is to “trust but verify” or more accurately, verify before trusting. This applies to almost everything in life. To illustrate, one of my friends did fall for the money gram scam above a couple years ago. After sending the money, she had some doubts so she called the friend up and the friend replied, “What? I’m not in XXXX country, robbed of everything, and in need of money!” My question is: Why didn’t she call up the friend or the friend’s family first before sending money? If she had verified first, the friend or the friend’s family would have told her that the email was a fake.

Email Links: Bad Idea

Avoid clicking on any links in an email, especially an email from your bank. Definitely, do not login if the link takes you to a login page where you are prompted to input your username and password. Instead, open up a browser and manually type in the address of your bank or whatever.

If you’re lucky, clicking on links indiscriminately may get your computer infected with a virus or spyware which will just slow down your computer. If you’re unlucky, a virus will erase your hard drive or spyware will record what you type, like passwords, and transmit the data to someone who doesn’t have your best interest in mind. Worst of all, if you click on a link to your bank account and input your username and password, you may have just given access to your bank account to a criminal.

The last is referred to as phishing (pronounced like “fishing” because they are “phishing” for your money) which involves pretending to be a trustworthy entity in order to acquire sensitive information. Basically, someone nefarious creates a website which looks exactly like your bank’s login page. They send you a fake email from your bank with a link. When you click on the link, you are taken to the fake login page. After you input your banking username and password, they could then forward you to the real bank or just throw an error that maintenance is in progress. In the meantime, they have your username and password to access your bank account with.

Phishing may be used to gain access to accounts at companies other than your bank, like investment firms, credit card companies, loan application processors, mortgage payment companies, etc. I believe that all legitimate businesses should make it a policy to not include any links in their official emails; instead, they should ask their users to manually browse to their company websites.

Note: If you receive a complicated link in an email, perhaps pointing to a specific Google or Yahoo photo album, which requires a login and you can’t figure out how to manually browse to it, here’s what you can do:

  1. Browse to the company address by manually typing it in, and log into your account.
  2. Go back to the email and click on the link.

If the link is legitimate, the system will recognize that you are already logged in and bypass the login screen. You would then go directly to that page; that is, the photo album. Doing the above will help you to avoid being tricked by a phishing website.

Phone Calls: Just Hang Up

Similar to the above, if you get a phone call from your bank and are asked to verify your identity, ask what the call is about, say bye-bye, and call your bank’s official phone number (listed on the back of your ATM card, on their website, or in the phone book). Calling them directly is the equivalent of manually browsing to the company website. If the “bank” calls you and you provide your verification info (mother’s maiden name, social security, etc.), you may have just given your identity away to thieves, who could then gain access to your accounts or, more likely, open a new credit card or loan in your name.

Knowing the above, the perpetrators will attempt to override your caution. A year ago, I got a phone call from my credit card company. They told me that they believed my credit card number had been stolen because they were seeing charges for flowers amounting to over a thousand dollars in Florida. They asked me to verify my identity so they could confirm that the charges were fraudulent. Of course, I answered every question they asked. Afterwards, I realized with horror that I might have just given the keys to my identity away to someone who “called” me on the phone. Thankfully it was a legitimate call, but it could have easily been a trick. What I should have done was ask them what the call was about, hang up, and call the credit card company back directly.

Phishing: Old as the Pharaohs

The above could have been a phishing attempt. Phishing isn’t something new on the Internet; it has been around for a long time. I’m sure it has been around since mankind first discovered how to cheat and steal. I think all effective scams involve the use of phishing (again, pretending to be a trustworthy entity) because no one hands their money to some entity they don’t trust.

For example, suppose that you are on a business trip. You arrive late at the hotel. You’re hungry but too tired to go out. Conveniently, there is a flyer for pizza delivery that someone slipped under the hotel room’s door. You dial up the pizza place, make an order, and pay with your credit card. An hour later, the pizza hasn’t arrived yet. You call back and get some lame excuse like the oven has exploded, sorry, but there won’t be pizza for anyone. Or maybe no one picks up. Congratulations, you’ve just had your credit card number stolen.

Remember what P.T. Barnum supposedly said, “There’s a sucker born every minute.” Try not to be that sucker. But if you fall for a scam (which I must embarrassingly admit to once or twice), forgive yourself. You are only human. Just repeat to yourself, “There’s a human born every minute.” (To be exact, there’s a human born every 8 seconds.)


The Internet’s Future is Blacklisted

Internet

Over the weekend, I signed up for a shared web hosting plan because of a special deal. I spent a day setting up the host, migrating a website, and testing to make sure it worked. On Monday, when I went to work, I thought I would check to see the status of my website. Imagine my surprise when I got a security warning that my website was dangerous, known to host viruses and spyware. How could this be? This is a respectable website which I have just moved to a new server.

It turns out that my work’s Intranet is protected by a network security appliance called Ironport. Ironport in turn depends upon SenderBase, a blacklist service that identifies dangerous websites. The blacklist is keyed off the IP address. The new server’s IP address was flagged and thus, anything hosted on it (like my website) inherits the negative status.

When we get a shared web hosting account, we are assigned one of the servers which have available capacity. Now, why would that server have excess capacity? Perhaps, it is because a previous user was kicked out for bad behavior, like distributing viruses, spyware, or spam. Well, that someone’s bad behavior got the IP address blacklisted. And now, I am the proud owner of that banned IP address.

Note: The above doesn’t just apply to shared web hosting. If you get a private server or virtual private server, the provider company will give you an available IP address. That IP address could have belonged to someone previously who had misbehaved.

So maybe I and others whose companies use network security appliances can’t browse to my website. So what, we’re supposed to be working, right? Unfortunately, it turns out that email is also affected. If you expect to send and receive mail using your server, the server’s blacklisted IP address could cause all the email traffic to and from your server to get bounced (not delivered).

Worse, as far as I can tell, once the IP address is blacklisted, it is very hard to get that status removed. You’ll have to hope that your hosting provider is motivated enough to go through the hassle of engaging one or more blacklisting companies to remove that negative status. Even if your provider is willing, it will take time before the IP address is cleared.

Having learnt my lesson, the first thing I suggest doing after getting a web hosting or private server account is to check that its IP address is not blacklisted. You can check the IP address on the following websites:

Note: Not all of the blacklists are widely used, so it may be okay for the IP address to be on one or two blacklists. However, to be on the safe side, it is best to have an IP address which doesn’t appear on any blacklist.
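
If you prefer to check from the command line, many of these blacklists are DNS-based and can be queried directly. Here is a rough sketch using the Spamhaus ZEN list and a placeholder IP address (reverse your server’s IP octets and prepend them to the blacklist’s zone):

# Check whether 192.0.2.10 (a placeholder IP) is on the Spamhaus ZEN blacklist.
dig +short 10.2.0.192.zen.spamhaus.org

# An empty result means the IP is not listed; an answer in the
# 127.0.0.x range means it is listed.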

If your IP address is blacklisted, ask your hosting provider company for another. If the company won’t accommodate you, then cancel and go with one that will. Believe me, doing so will avoid a lot of wasted effort and work. You don’t want a customer browsing to your company website only to get a stern warning that your website is known to distribute viruses.

I am afraid that I am seeing the future of the Internet. As security concerns grow, companies will invest in solutions, like network security appliances, that make use of blacklists (and maybe whitelists). Heck, if I was in charge of my company’s network security, a network security appliance would be the minimum that I would advocate. I would take more drastic steps like locking down inbound and outbound ports, and aggressively running heuristic checks on all internal traffic to detect viruses and spyware.


Make Mac Screen Lock Secure and Convenient

Mac OS X

The Macbook I got for work is configured to require the password after the screensaver turns on or the display goes to sleep. By default, the screen is set to sleep after 2 minutes of inactivity on battery and 10 minutes on power adapter. When I work on two computers, alternating between the Macbook and a desktop, I hate having to keep inputting the password on the Macbook to unlock it.

I understand the need for security, but I draw the line when it makes using the Macbook too inconvenient. I don’t want to eliminate the password requirement, I just want the screen locks (which require the password to exit from) not to occur so often.

I considered adjusting the power settings so that the Macbook won’t go to sleep until an hour of inactivity occurs on either battery or power adapter. (Likewise, changing the screen saver to wait an hour.) However, making such a change would cause the battery usage to increase (the display uses a lot of power) and require a shorter interval between charges. (To preserve the battery capacity, I usually use the battery until it is very low before charging. And when charging, I try to give it an opportunity to charge to 100 percent.) While I don’t use the Macbook differently on battery versus power adapter, having to charge and being tethered to the wall socket more often is inconvenient.

I found the solution in “System Preferences”, under the “Security & Privacy” section. There is an option named “Require password [time interval] after sleep or screen saver begins” that controls when the screen lock activates. I changed the time interval from the initial 5 seconds to 1 hour. (There are 7 selectable time intervals ranging from immediately to “4 hours”.) Now, when the screen saver runs or the Macbook goes to sleep (for example, when I close the lid), I don’t need to input the password when I wake the Macbook before the 1 hour interval expires.

This setting gave me a good compromise between security and convenience. I am not required to input the password for any inactivity less than an hour and I can leave the power (and screen saver) settings on battery conservation mode.
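
For what it’s worth, this preference is reportedly also exposed to the “defaults” command on OS X of this vintage; I haven’t relied on it myself, so treat the key names below as an assumption to verify rather than a sure thing:

# Supposedly equivalent to the System Preferences setting above:
# require the password 1 hour (3600 seconds) after sleep or screen saver begins.
defaults write com.apple.screensaver askForPassword -int 1
defaults write com.apple.screensaver askForPasswordDelay -int 3600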

But what if I need to put the Macbook immediately into screen lock mode? The answer surprisingly lies in the “Keychain Access” application. To support manually locking the Mac, do the following:

  1. Run the “Keychain Access” application (under /Applications/Utilities directory).
  2. Go to the “Keychain Access” menu, Preferences, General, and check the “Show keychain status in menu bar” option.

You should now see a lock icon on the top-right menu bar. When you want to manually lock the Mac, click on the lock icon and select “Lock Screen”.

Hopefully the above will help you to secure your Mac without making it too inconvenient to use.

Note: The “Lock Screen” method above came from Quickly lock your screen. Unfortunately, on Mac OS X Mountain Lion, the rearrange-menu-bar-icon function (hold Cmd and drag the icon left or right) didn’t work, so I was not able to get a keyboard shortcut working for “Lock Screen”.


Nginx HTTPS SSL and Password-Protecting Directory

Linux 1 Comment

See my previous post, Nginx Multiple Domains, Postfix Email, and Mailman Mailing Lists, to learn how to configure multiple domains and get Postfix email and Mailman mailing lists working on an unmanaged VPS. In this post, I will configure Nginx to enable HTTPS SSL access and password-protect a directory.

Note: Though I’m doing the work on a Digital Ocean VPS running Ubuntu 12.04.3 LTS, the instructions may also apply to other VPS providers.

Enable HTTPS/SSL Access

I have a PHP application which I want to secure. If I use HTTP, then the information sent back from the server to my browser is in clear text (and visible to anyone sniffing the network). If I use HTTPS (HTTP Secure) with a SSL (Secure Sockets Layer) server certificate, then the information will be encrypted. In the steps below, I will configure HTTPS/SSL to work for a domain and then force HTTPS/SSL access on a particular directory (where the PHP application would be located).

To get HTTPS working, we need a SSL server certificate. While you can get a 3rd party certificate authority to issue a SSL certificate for your domain for about $10 per year, I only need a self-signed certificate for my purpose. A 3rd-party-issued SSL certificate is convenient because, if the browser trusts the 3rd party certificate authority by default, it won’t prompt you to accept the SSL certificate like it would for a self-signed certificate (for which the browser can’t establish a chain of trust). If you run a business on your website, I recommend investing in a 3rd party SSL certificate so that your website behaves professionally.

Create a self-signed SSL server certificate by running these commands on the server:

Note: You don’t need to input the lines that start with the pound character # below because they are comments.

# Create a directory to store the server certificate.
sudo mkdir /etc/nginx/ssl

# Change to the newly-created ssl directory.  Files created below will be stored here.
cd /etc/nginx/ssl

# Create a private server key.
sudo openssl genrsa -des3 -out server.key 1024
   # Remember the passphrase you entered; we will need it below.

# Create certificate signing request.
# (This is what you would send to a 3rd party authority.)
sudo openssl req -new -key server.key -out server.csr
   # When prompted for common name, enter your domain name.
   # You can leave the challenge password blank.

# To avoid Nginx requiring the passphrase when restarting,
# remove the passphrase from the server key. (Otherwise, on
# reboot, if you don't input the passphrase, Nginx won't run!)
sudo mv server.key server.key.pass
sudo openssl rsa -in server.key.pass -out server.key

# Create a self-signed certificate based upon certificate request.
# (This is what a 3rd party authority would give back to you.)
sudo openssl x509 -req -days 3650 -in server.csr -signkey server.key -out server.crt

Note: I set the certificate expiration time to 3650 days (10 years); 3rd party certificates usually expire in 365 days (1 year). The maximum number of days you can input depends on the OpenSSL implementation. Inputting 36500 days (100 years) would probably fail due to math overflow errors (once you convert 100 years into seconds, the value is too big to store in a 32-bit variable). I believe the highest you can go is about 68 years, but I haven’t tested it.
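To double-check the files you just generated, OpenSSL can print the certificate's subject and validity dates and confirm that the passphrase was removed from the key:

# Display the certificate's subject and expiration dates.
sudo openssl x509 -in /etc/nginx/ssl/server.crt -noout -subject -dates

# Verify the server key; this should complete without prompting for a passphrase.
sudo openssl rsa -in /etc/nginx/ssl/server.key -check -noout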

Configure Nginx to use the SSL server certificate we created by editing the server block file for the domain you want to use it on:

sudo nano /etc/nginx/sites-available/domain2

In the “domain2” server block file, find the commented-out “HTTPS server” section at the bottom, uncomment it, and edit it to look like the following:

# HTTPS server
#
server {
        listen 443;
        server_name mydomain2.com www.mydomain2.com;

        root /var/www/mydomain2;
        index index.php index.html index.htm;

        ssl on;
        ssl_certificate /etc/nginx/ssl/server.crt;
        ssl_certificate_key /etc/nginx/ssl/server.key;

#       ssl_session_timeout 5m;
#
#       ssl_protocols SSLv3 TLSv1;
#       ssl_ciphers ALL:!ADH:!EXPORT56:RC4+RSA:+HIGH:+MEDIUM:+LOW:+SSLv3:+EXP;
#       ssl_prefer_server_ciphers on;

        location / {
                try_files $uri $uri/ /index.php;
        }

        # pass the PHP scripts to FPM-PHP
        location ~ \.php$ {
                fastcgi_split_path_info ^(.+\.php)(/.+)$;
                fastcgi_pass unix:/var/run/php5-fpm.sock;
                fastcgi_index index.php;
                include fastcgi_params;
        }
}

Note: The “HTTPS server” section looks like the “HTTP server” section we configured previously at the top, except for the addition of “listen 443” (port 443 is the HTTPS port) and the SSL-enabling configurations.

Open up the HTTPS port in the firewall and reload Nginx by running these commands on the server:

# Allow HTTPS port 443.
sudo ufw allow https

# Double-check by looking at the firewall status.
sudo ufw status

# Reload Nginx so changes can take effect.
sudo service nginx reload

Test by browsing to “https://mydomain2.com/”. When the browser prompts you to accept the self-signed server certificate, answer Yes.
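You can also sanity-check the configuration and the certificate from the command line; these optional checks assume you are on the server (or anywhere that can reach port 443) and that mydomain2.com resolves to it:

# Check the Nginx configuration syntax.
sudo nginx -t

# Connect to port 443 and show the certificate the server presents.
openssl s_client -connect mydomain2.com:443 < /dev/null 2> /dev/null | openssl x509 -noout -subject -dates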

Require HTTPS/SSL Access on a Directory

To require HTTPS/SSL-only access on a particular subdirectory under the domain, we need to add a directive to the domain’s HTTP Server to redirect to the HTTPS Server whenever a browser accesses that directory.

Note: Apache uses a .htaccess file to allow users to configure such actions as redirecting or password-protecting directories. Nginx does not use .htaccess; instead, we will put such directives in the server block files.

Create a secure test directory by running these commands on the server:

# Create a secure test directory.
sudo mkdir /var/www/mydomain2/secure

# Create a secure test page.
sudo nano /var/www/mydomain2/secure/index.html
   # Input this content:
   <html><body>
   This page is secure!
   </body></html>

# Change owner to www-data (which Nginx threads run as) so Nginx can access.
sudo chown -R www-data:www-data /var/www/mydomain2/secure

Edit the domain’s server block file by running this command on the server:

sudo nano /etc/nginx/sites-available/domain2

In the “domain2” server block file, in the “HTTP server” section at the top (not the “HTTPS server” section at the bottom), add these lines to do the redirect:

server {
        #listen   80; ## listen for ipv4; this line is default and implied
        #listen   [::]:80 default ipv6only=on; ## listen for ipv6
        ...

        # Redirect mydomain2.com/secure to port 443.
        # Please put this before location / block as
        # Nginx stops after seeing the first match.
        # Note: ^~ means match anything that starts with /secure/
        location ^~ /secure/ {
                rewrite ^ https://$host$request_uri permanent;
        }

        ...
        location / {
        ...
}
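As an aside, recent Nginx versions can express the same redirect with a return directive, which is generally preferred over rewrite for simple whole-request redirects; this is an optional alternative, not a required change:

        # Alternative: redirect using "return" instead of "rewrite".
        location ^~ /secure/ {
                return 301 https://$host$request_uri;
        }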

Reload Nginx so the changes above can take effect.

sudo service nginx reload

Test by browsing to “http://mydomain2.com/secure/” and the browser should redirect to “https://mydomain2.com/secure/”.
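If you prefer to test from the command line, curl can show the redirect response headers (assuming curl is installed on the client):

# Request the secure directory over HTTP and show only the response headers.
curl -I http://mydomain2.com/secure/
   # Expect a "301 Moved Permanently" response with a Location header
   # pointing to https://mydomain2.com/secure/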

Password-Protect a Directory

By password-protecting a directory (aka requiring basic authentication), when a browser accesses that directory, the user will get a dialog asking for the user name and password. To get this functionality working, we will create a user and password file and configure the Nginx server block to require basic authentication based upon that file.

Note: Accessing a password-protected directory over plain HTTP results in the user name and password being sent effectively in clear text by the browser (Basic Authentication only Base64-encodes them), so consider combining it with the HTTPS redirect described above.

Create a protected test directory by running these commands on the server:

# Create a protected test directory.
sudo mkdir /var/www/mydomain2/protect

# Create a protected test page.
sudo nano /var/www/mydomain2/protect/index.html
   # Input this content:
   <html><body>
   This page is password-protected!
   </body></html>

# Change owner to www-data (which Nginx threads run as) so Nginx can access.
sudo chown -R www-data:www-data /var/www/mydomain2/protect

We will need a utility from Apache to create the user and password file. Run this command on the server to install and use it:

# Install htpasswd utility from Apache.
sudo apt-get install apache2-utils

# Create a user and password file using htpasswd.
sudo htpasswd -c /var/www/mydomain2/protect/.htpasswd myuser

# Add an additional user using htpasswd without "-c" create parameter.
sudo htpasswd /var/www/mydomain2/protect/.htpasswd myuser2

# Change owner to www-data (which Nginx threads run as) so Nginx can access.
sudo chown www-data:www-data /var/www/mydomain2/protect/.htpasswd

Note: If you move the “.htpasswd” file to another location (say, not under the domain’s document root), make sure that the “www-data” user or group can access it; otherwise, Nginx won’t be able to read it.
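If you would rather not install apache2-utils, an htpasswd-compatible entry can also be generated with OpenSSL. This is an alternative sketch; it is not needed if you already used htpasswd above:

# Generate an APR1 (htpasswd-compatible) hash for "myuser" and append it to the file.
# "openssl passwd" will prompt twice for the password.
echo "myuser:$(openssl passwd -apr1)" | sudo tee -a /var/www/mydomain2/protect/.htpasswd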

Edit the Nginx server block file by running this command on the server:

sudo nano /etc/nginx/sites-available/domain2

In the “domain2” server block file, in the “HTTP server” section at the top (not the “HTTPS server” section at the bottom), add these lines to password-protect the “/protect” directory:

server {
        #listen   80; ## listen for ipv4; this line is default and implied
        #listen   [::]:80 default ipv6only=on; ## listen for ipv6
        ...

        # Password-protect mydomain2.com/protect directory.
        # Please put this before location / block as
        # Nginx stops after seeing the first match.
        # Note: ^~ means match anything that starts with /protect/
        location ^~ /protect/ {
                auth_basic "Restricted"; # Enable Basic Authentication
                auth_basic_user_file /var/www/mydomain2/protect/.htpasswd;
        }

        ...
        location / {
        ...

        # Deny access to .ht files like .htpasswd. (In the default server
        # block file, this section is commented out; uncomment it.)
        # It is recommended to copy this to the HTTPS server below also.
        location ~ /\.ht {
                deny all;
        }

    ...
}

The “^~” in “location ^~ /protect/” above tells Nginx to match anything that starts with “/protect/”. This is necessary to ensure that all files and directories under “/protect/” are also password-protected. Because Nginx stops once it finds a match, it won’t process subsequent match directives, such as the PHP-FPM directive, and PHP scripts won’t execute. If you wish to run PHP scripts under the password-protected directory, you must copy the PHP-FPM directive (and any other directives) under the password-protected location directive like so:

server {
        ...

        # Password-protect mydomain2.com/protect directory.
        # Please put this before location / block as
        # Nginx stops after seeing the first match.
        # Note: ^~ means match anything that starts with /protect/
        location ^~ /protect/ {
                auth_basic "Restricted"; # Enable Basic Authentication
                auth_basic_user_file /var/www/mydomain2/protect/.htpasswd;

                # pass the PHP scripts to FPM-PHP
                location ~ \.php$ {
                        fastcgi_split_path_info ^(.+\.php)(/.+)$;
                        fastcgi_pass unix:/var/run/php5-fpm.sock;
                        fastcgi_index index.php;
                        include fastcgi_params;
                }

                # deny access to .ht files like .htpasswd
                location ~ /\.ht {
                        deny all;
                }
        }

        ...
        # pass the PHP scripts to FPM-PHP
        location ~ \.php$ {
                ...        
}

Reload Nginx so the changes above can take effect.

sudo service nginx reload

Test by browsing to “http://mydomain2.com/protect/” and the browser should prompt you for a user name and password.
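Basic authentication can also be exercised with curl (again assuming curl is available on the client):

# Without credentials, expect an HTTP 401 response.
curl -I http://mydomain2.com/protect/

# With credentials, expect an HTTP 200 response and the protected page.
curl -u myuser http://mydomain2.com/protect/
   # curl will prompt for myuser's password.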

Secure Mailman

To run Mailman under HTTPS/SSL, move the “location /cgi-bin/mailman” definition in the server block file, “/etc/nginx/sites-available/domain2”, from the HTTP server section to the HTTPS server section.
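Structurally, this is just relocating the existing location block (with whatever directives it already contains from the earlier post) into the HTTPS server block; a rough sketch:

# HTTPS server
server {
        listen 443;
        server_name mydomain2.com www.mydomain2.com;
        ...

        # Moved here unchanged from the HTTP server block.
        location /cgi-bin/mailman {
                # (existing Mailman directives from the earlier post)
        }
}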

You will also need to modify Mailman to use the HTTPS URL:

# Edit Mailman's configuration
sudo nano /etc/mailman/mm_cfg.py
   # Change its default url pattern from 'http://%s/cgi-bin/mailman/' to:
   DEFAULT_URL_PATTERN = 'https://%s/cgi-bin/mailman/'

# Propagate the HTTPS URL pattern change to all the mailing lists
sudo /usr/lib/mailman/bin/withlist -l -a -r fix_url

Note: It is not necessary to restart the Mailman service for the changes above to take effect.

If you only want the default URL pattern change to apply to a specific mailing list, like “test@mydomain2.com”, use this command instead:

sudo /usr/lib/mailman/bin/withlist -l -r fix_url test -u mydomain2.com

Take a Snapshot

Digital Ocean provides a web tool to take a snapshot image of the VPS. I can restore using that image or even create a duplicate VPS with it. Because my VPS is now working the way I need it to, it makes sense to take a snapshot at this time.

Unfortunately, performing a snapshot requires that I shut down the VPS first. More unfortunately, the time required to take the snapshot varies from minutes to over an hour (more on this below). Worst of all, there is no way to cancel or abort the snapshot request. I have to wait until Digital Ocean’s system completes the snapshot request before my VPS is automatically restarted.

I did my first snapshot after getting WordPress working on the VPS. There was about 6GB of data (including the operating system) to make an image of. I shut down the VPS and submitted a snapshot request. For over an hour, all I saw was a “Processing…” status with zero progress. During this time, my VPS and WordPress site were offline.

A little over an hour later, the status went from “Processing…” with zero progress to done in a split second. My VPS and WordPress site were back online. I think an hour to back up 6GB of data is excessive, and Digital Ocean support agreed. Evidently, there was a backlog in the scheduler and requests were delayed. Because I couldn’t cancel the snapshot request, I had to wait for the backlog to clear in addition to however long the snapshot itself took.

If I had known more about the snapshot feature, I would have opted to pay for the backup feature, which costs more but doesn’t require shutting down the VPS. Unfortunately, the backup feature can only be enabled during VPS creation, so it is too late for me.

The recommended method to shut down the VPS is to run this command:

sudo poweroff

Update: I just did a snapshot and it only took 5 minutes this time.

See my followup post, Upgrade Ubuntu and LEMP on an Unmanaged VPS, to learn how to upgrade LEMP and Ubuntu to the latest versions.


