About time

I am fascinated by science. It defines, refines, and changes our understanding of our very existence.

Changes in understanding both great and small have included insects and mice springing spontaneously from rotting food and undisturbed leftover grain, the earth going from the center of the universe to just a planet orbiting a star in a solar system orbiting a galactic core, and the sun going from the only star in the universe with planets to one of countless stars where planets are common.

To follow the scientific method, everything that’s defined must be observable and repeatable by independent researchers.

So how do we define things that aren’t testable? Things like time. Wait, what? We all know what time is, yes? We experience its flow every day. Our watches measure it. Our phones, computers, and networks measure it to enable communication. Scientists predict eclipses, and tourists flock to see them based on the predictions those scientists publish. It all seems to work.

Then you read something like this: Do We Actually Experience the Flow of Time?

Now what.

Passwords: Make it safe

Got hacked, locked out of files and accounts? It happens to lots and lots of people.

A few people are actual selected targets, a small minority I believe. The others? They’re the “catch,” the result of cybercriminals casting a wide net with their tools.

When I talk with people about safe passwords, they often say things like “I can’t remember so many” or “It’s too hard to come up with good memorable passwords”, and often “I just don’t understand how to manage it”.

To them I say a password manager is your friend and protector. Refer to this article, Why You Shouldn’t Use Your Web Browser’s Password Manager, for useful information about password managers.

There are a few things I see a bit differently than the article does.

First, I disagree with the basic premise. In my experience the best way for people to start doing something new is to start from where they already are.

So if you want to use the password manager in your web browser, go ahead. The catch is that you must stick to using that browser. If you already do that, why not stick with it?

Second, I disagree that the open source password managers mentioned are more complex to use than the other password managers mentioned, especially if you already store files in the cloud with Dropbox, Google Drive, etc.

And I see an advantage for the separate password manager. If you use a password manager like KeepassX and the file sharing site gets hacked, you’re still the only one with the password database’s password. If your online password manager site is hacked, then all your passwords are compromised.

In the case of using your browser or something like LastPass to manage your passwords, an account must be created with the provider of the password management service. That’s essentially only one layer of protection.

If KeepassX or something similar is used there are two layers: the file sharing website and the password database itself.

Multi-layer protection is where it’s at, baby! (said in an Austin Powers voice)

The most important part of all this is to set up a different, complex password for each site you use.

Use your browser’s password manager, an external service like LastPass, or a separate password manager like KeepassX combined with an online file storage service to create unique, complex passwords for each site you use, and you’ve improved your security by leaps and bounds.
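
If coming up with random passwords is the sticking point, any decent password manager will generate them for you. In a pinch a one-liner on the command line works too; openssl is just one option and the length here is arbitrary:

# print 18 random bytes as a 24-character base64 string
openssl rand -base64 18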

Virtual Host??

Setting up Apache to support multiple websites on one host. My server already does that for my public websites.

However, I want to control what is returned to the browser if a site isn’t available for some reason. So I’ve set up a virtual server with multiple sites. Each site works when enabled. However, if a site is set up to be unavailable (disabled, no index file, etc.), the default page returned to the browser is not what I’d like.

I need to identify a few fail conditions, see what the server returns when each condition exists, check whether what’s returned for a given condition is the same regardless of which site generates the failure, and then figure out why the web server sends back the page it does. A couple of rough sketches follow the lists below.

Reasons not available:

  • site not being served, e.g. not enabled on server
  • site setting wrong, e.g. DocumentRoot invalid
  • site content wrong, no index file
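
A rough sketch of how I plan to trigger those conditions on the test server. The site names and paths are placeholders, and this assumes a Debian/Ubuntu style Apache layout with a2ensite/a2dissite:

# site not being served: disable it and reload Apache
sudo a2dissite site1.conf
sudo systemctl reload apache2

# site setting wrong: edit site2.conf so DocumentRoot points at a
# directory that doesn't exist, then reload again

# site content wrong: move the index file out of the way (no reload needed)
sudo mv /var/www/site3/index.html /var/www/site3/index.html.bak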

Answers that might be returned:

  • site not available
  • forbidden
  • …others I’ve seen but don’t remember now
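
Then a quick probe to record what each site actually returns under each condition. This assumes the test sites answer on localhost ports 8000 through 8004, matching the port-forwarding rules in the VBoxManage notes below:

# record the HTTP status code returned by each test site
for port in 8000 8001 8002 8003 8004; do
    code=$(curl -s -o /dev/null -w '%{http_code}' "http://localhost:${port}/")
    echo "port ${port}: HTTP ${code}"
done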

From what I’ve read it seems whatever’s in 000-default.conf should control which page/site loads when a site isn’t available. That’s not the result I’m getting.

Either I’m doing it wrong or I’m just not understanding what’s supposed to happen and how to make it happen.
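
One thing worth checking along the way: which vhost Apache actually treats as the default server for each address and port. On a Debian-style setup (where the command is apache2ctl; elsewhere it may be apachectl) this dumps the parsed vhost configuration, and the enabled site files are read in alphabetical order, which is why 000-default.conf normally sorts first:

# list loaded vhosts and the default server for each port
apache2ctl -S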

More digging…

VBoxManage

An important VirtualBox command to be familiar with. It gets virtual machine info that can be copy-pasted into documents and other commands.

vboxmanage list runningvms

It also displays a running machine’s properties without having to navigate the UI. Good for a quick review of network settings too.

vboxmanage showvminfo "VWebHostTest" | grep "Name: \|Rule"
Name:                        VWebHostTest
NIC 1 Rule(0):   name = Web8000, protocol = tcp, host ip = , host port = 8000, guest ip = , guest port = 8000
NIC 1 Rule(1):   name = Web8001, protocol = tcp, host ip = , host port = 8001, guest ip = , guest port = 8001
NIC 1 Rule(2):   name = Web8002, protocol = tcp, host ip = , host port = 8002, guest ip = , guest port = 8002
NIC 1 Rule(3):   name = Web8003, protocol = tcp, host ip = , host port = 8003, guest ip = , guest port = 8003
NIC 1 Rule(4):   name = Web8004, protocol = tcp, host ip = , host port = 8004, guest ip = , guest port = 8004
NIC 1 Rule(5):   name = ssh, protocol = tcp, host ip = , host port = 2223, guest ip = , guest port = 22
NIC 1 Rule(6):   name = web8080, protocol = tcp, host ip = , host port = 8080, guest ip = , guest port = 8080
NIC 1 Rule(7):   name = web8800, protocol = tcp, host ip = , host port = 8800, guest ip = , guest port = 80
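
For the record, rules like the ones above can also be added from the command line instead of clicking through the UI. A sketch, with a made-up rule name and port:

# add a NAT port-forwarding rule to a stopped VM
vboxmanage modifyvm "VWebHostTest" --natpf1 "Web8005,tcp,,8005,,8005"

# or add the same rule while the VM is running
vboxmanage controlvm "VWebHostTest" natpf1 "Web8005,tcp,,8005,,8005"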