Picking a Monitor

Whether you are looking to get a new desktop or build your own gaming rig, one of the most critical decisions is which monitor to choose. Oftentimes this is something we don’t think about, and we simply buy the cheapest one. The flaw in that logic: you already paid for a high-tech monster gaming computer, but you’re playing on an old, low-resolution square monitor, wondering why you spent the money on a rig that isn’t much better than your old Dell. Here is what you need to keep in mind when choosing your next monitor.

Size / Resolution:

These two go hand in hand, so we will talk about them together. If you are simply using your computer for word processing and doing your taxes once a year, you probably don’t need to put much thought into the size or resolution. However, if you want to watch movies or play games, these are two critical components of your decision. The larger the monitor and the higher the resolution, the more powerful the GPU (Graphics Processing Unit) you are going to need. You can’t get a low-end gaming rig and expect to run a 42″ 4K monitor; the computer won’t be able to keep up. Balance is critical. Make sure you check the specs on your graphics card to ensure it allows for 4K gameplay.
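To see why resolution matters so much for GPU load, here is a quick back-of-the-envelope comparison of how many pixels the card must fill every frame. The numbers are illustrative only; real-world load also depends on game settings and frame rate.

```python
# Pixels a GPU must render per frame at common monitor resolutions.
# Illustrative only -- actual load also depends on game settings.
resolutions = {
    "1080p (Full HD)": (1920, 1080),
    "1440p (QHD)": (2560, 1440),
    "4K (UHD)": (3840, 2160),
}
base = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base:.1f}x the work of 1080p)")
```

A 4K panel asks the GPU to fill four times as many pixels as 1080p every single frame, which is exactly why a low-end card can’t keep up with a 42″ 4K monitor.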

It is easy to find a monitor 24″ or larger with a 16:9 aspect ratio (Full HD, or 1080p). If you are new to gaming and own a lower-end rig, or are just watching movies, a 24″ Full HD monitor is probably the route for you. They are not very expensive and will do the job.

Refresh Rate:

Most monitors come in 60Hz. This number indicates how many times per second the monitor displays a new frame. The higher the number, the more frames and the smoother the action. If you are a gamer and like fast-paced shooters, look for a monitor at 120Hz or above. This will keep the picture clearer during twitch-style gameplay.
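Another way to think about refresh rate is the time budget each frame gets, which is easy to sketch out:

```python
# Time between new frames at common refresh rates (1000 ms in a second).
for hz in (60, 120, 144):
    frame_ms = 1000 / hz
    print(f"{hz}Hz -> a new frame every {frame_ms:.1f} ms")
```

At 120Hz the monitor shows a new frame roughly every 8 ms instead of every 17 ms, which is where the extra smoothness in fast shooters comes from.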

Screen Brightness / Contrast:

Think about the room you will have your computer in. Does it have a lot of windows? Is it mostly dark? Do you work all day and only play at night?

If you are in a bright room, you will want to make sure your screen can keep up with a high brightness level, measured in candelas per square meter (cd/m², often called nits). The higher the number, the better. Contrast is the difference between pixel colors; in other words, the higher the contrast ratio, the bigger the difference between true black and “not so true but close” black. If your monitor has a low contrast ratio, you may not be able to see a difference between those two shades.
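Contrast ratio is simply white luminance divided by black luminance. A tiny sketch with made-up numbers (not the specs of any real monitor):

```python
# Contrast ratio = luminance of full white / luminance of full black.
# The values below are illustrative, not specs of a real monitor.
white_nits = 300.0   # full-white brightness in cd/m^2 ("nits")
black_nits = 0.3     # full-black brightness
ratio = white_nits / black_nits
print(f"Contrast ratio: {ratio:.0f}:1")  # -> Contrast ratio: 1000:1
```

The deeper a panel can make its blacks, the bigger this ratio gets, which is why OLED-style per-pixel dimming produces such striking contrast numbers.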

Both of these features can be adjusted after purchase through the on-unit menu.

Ports / Extras:

The last factor in your decision should be how many USB ports, HDMI connectors, DisplayPorts, etc. are included on the monitor. Most often, having an HDMI or a DisplayPort (which serves the same purpose as HDMI) and a few USB ports is sufficient; however, think about what you want to connect and make sure the monitor has what you are looking for. If your tower (or rig) sits behind your desk and it is easier to connect your mouse to a USB port on the monitor, think about how many other things you would like to connect (headphones, mic, etc.) as well.

Many monitors also come with cameras, built-in microphones, and touchscreen capabilities.


Picking a monitor isn’t rocket science. There aren’t too many options, and it’s easy to find what you want if you know what to look for. The most important thing to consider is your use: what do you intend to do with the monitor? If you are a gamer, you may want a 32″ 4K 144Hz beast. If you only need it to store pictures and write e-mails, a simple 20″ HD monitor with touchscreen capabilities will do the trick.

What is RAM Anyway?

Just about everyone nowadays owns a computer, and inside it are many parts. Each serves a specific purpose, and when they all work together, your computer runs smoothly.

One of these parts is called RAM, an acronym for Random Access Memory. This component serves as your computer’s temporary storage, holding the information it is actively using to perform tasks.

An easy way to understand this process is to compare it to cooking. When you cook, you take the ingredients out of the fridge and pantry. The fridge and pantry are like your computer’s long-term storage: the hard drive. You, or the CPU (Central Processing Unit), take out the ingredients and lay them all out on the counter. The counter is your RAM.

If you have a small amount of RAM (say, 4GB), you have a very small counter, and it is hard to find everything you need quickly. If you have a large amount (16GB or more), you have a large counter where everything is easily laid out in front of you.

You grab the ingredients and cook your meal just as the CPU takes the information bits it needs and runs the program.

As for how much RAM you need, that is a pretty easy question. If you plan to do something that requires a lot of processing or graphical power, such as gaming or graphic design, you should probably stay around the 16GB or 32GB mark.

If you only use your computer for word processing, you will probably be safe around 8GB. I don’t recommend anyone go below 8GB, as many modern programs expect at least that much (and some recommend 16GB).
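If you are curious how much RAM your current machine has, you can check from Python. This is a sketch that assumes a POSIX system (Linux or macOS) where these `sysconf` names are available; Windows would need a different API.

```python
import os

# Total physical RAM = page size x number of physical pages.
# Assumption: POSIX system exposing these sysconf names.
page_size = os.sysconf("SC_PAGE_SIZE")    # bytes per memory page
page_count = os.sysconf("SC_PHYS_PAGES")  # physical pages installed
total_gb = page_size * page_count / (1024 ** 3)
print(f"Installed RAM: {total_gb:.1f} GB")
```

If the printed number comes out below 8, you are under the floor recommended above.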

2.4GHz or 5GHz, what?

If you own a newer wireless modem or router you may be faced with a choice.

When you bring up the list of available WiFi networks in your home, you may see a variety of options. How do you know which one to pick?


Don’t fret, the choice is pretty easy if you know what to look out for.


First, start by identifying which of the options belong to your network. Oftentimes your modem or router (whichever broadcasts your WiFi) will have a sticker on the bottom or side of the unit with the name of the network.

Second, once you’ve identified which network belongs to you, you may notice there are two similar but slightly different options: 2.4 or 5.

The two numbers at the end of your network name refer to the frequency band, in gigahertz. WiFi devices that broadcast on both of these bands are called dual-band, and they give you the ability to switch between the two bands depending on the device you are using and where you are using it.

The 2.4GHz band is designed for devices that may travel away from the WiFi network and require a longer range. The speed for these devices will be slower, but the signal can travel further.

The 5GHz band is designed for a shorter range but the network is faster.

Let’s say you have your WiFi hub in your living room. The TV in your living room is a SMART TV and connects to your network via WiFi. You would want that on the 5GHz band: the TV is stationary, close to your WiFi signal, and could use a boost in speed. Your daughter has an iPad and uses it in the upstairs loft. She likes to walk around with it and fire small tubby birds at seasick pigs. You would probably want her device connected to the 2.4GHz band.

Deciding which band to use goes a bit deeper, depending on the strength of your network, the location of your devices, and whether you have any relay hubs or signal boosters throughout your home. But the rule of thumb is: if the device is close to the signal and stationary, use the 5GHz band; if the device moves around, use the 2.4GHz band.
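The rule of thumb above is simple enough to write down as a function. A toy sketch; the function name and inputs are my own, not part of any router’s software:

```python
def pick_band(stationary: bool, near_signal: bool) -> str:
    """Rule of thumb: stationary devices near the signal get the fast
    5GHz band; anything that moves around gets longer-range 2.4GHz."""
    if stationary and near_signal:
        return "5GHz"
    return "2.4GHz"

print(pick_band(stationary=True, near_signal=True))    # living-room SMART TV
print(pick_band(stationary=False, near_signal=False))  # roaming iPad upstairs
```

Run through your own devices this way and the choice between the two networks usually makes itself.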

What to look for in a TV

So you want to buy a new TV? Great, we’ve already posted our picks for TVs at the beginning of 2018, and you can find that article here. However, for all of you techies out there who like to find their own way through the TV mess, here are some helpful tips for what to look out for.


LED vs OLED

Just about every modern TV out there is going to be one or the other: LED or OLED. LED stands for Light-Emitting Diode. This is essentially the backlight for the screen, which gives you the picture. The main downfall of LEDs is their size. They are too large to be a single pixel and therefore cause issues with high-contrast images, for instance a dark scene inside a cave. The biggest redeeming factor for LEDs is their ability to glow very bright, which has its benefits in bright rooms. OLEDs are very small and can be made the size of a single pixel, allowing them to be shut off individually. This ability gives you a true black tone in your picture, which increases your contrast and helps you discern between similar colors.

HD vs 4K

HD is a technology familiar to most of us. It refers to the number of pixels on the screen, giving you clearer definition. It comes in 720p, 1080i, or 1080p. The “p” stands for progressive scan and the “i” stands for interlaced. The biggest difference between them is that “p” will have a crisper picture during fast-paced action scenes, whereas “i” may blur during those same scenes. 4K is also a reference to how many pixels are on the screen. In fact, a 4K TV has 3840 x 2160 pixels, while 1080p only has 1920 x 1080. What that means to you: the more pixels, the sharper the image. If you are like me and have to have the sharpest picture, make sure to buy the 4K. Besides, the price difference isn’t anywhere near where it used to be.
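Those pixel counts are easy to verify yourself:

```python
# Total pixels on each screen: width x height.
full_hd = 1920 * 1080
four_k = 3840 * 2160
print(f"1080p: {full_hd:,} pixels")            # -> 1080p: 2,073,600 pixels
print(f"4K:    {four_k:,} pixels")             # -> 4K:    8,294,400 pixels
print(f"4K has {four_k // full_hd}x as many")  # -> 4K has 4x as many
```

Four times the pixels in the same screen area is what makes the 4K image noticeably sharper up close.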



HDR

HDR stands for High Dynamic Range. This technology greatly improves both the contrast ratio and the color quality. In fact, in side-by-side comparisons, viewers asked to pick between a 4K TV without HDR and a 1080p TV with HDR have overwhelmingly chosen the 1080p set, saying it had the better picture. HDR gives your TV a much truer tone and therefore much better picture quality. Couple HDR with 4K and you have a great-looking TV you can be happy with for years to come.

Refresh Rate

This is how many times the screen changes within a single second. The standard TV has a refresh rate of 60Hz, though many can be purchased at 120Hz. This means your TV cycles through either 60 or 120 frames per second. There is a kicker: unless you are a gamer and hook your TV up to your computer, 60Hz is all you will ever need (or get). Nearly all television service providers broadcast only in a 60Hz format, which means that even though your TV is capable of operating at 120Hz, it will still only show 60Hz content. Some manufacturers offer motion-smoothing boosts, for instance Samsung’s Clear Motion, which raise the effective rate from 120Hz to 240Hz. There is a slight, but noticeable, difference; nothing worth going out of your way for unless you absolutely must have it.

SMART Capability

SMART TVs are becoming the norm. This technology allows you to use apps like Netflix or Hulu without having to connect an outside device such as a Chromecast or Roku.

WiFi and / or Ethernet

A WiFi-capable TV is very common on today’s market. The biggest issue with the technology is that the upload/download speed is generally pretty slow and can cause delays or slow loading times when booting up an app on your SMART TV. This is where an Ethernet-capable TV comes in handy. Bypassing the built-in WiFi and simply plugging the TV into your modem or router will increase your speeds considerably. WiFi will do the job, though, so no sweat either way.

HDMI / USB Ports

Not a lot to be said here. You just want to take note of how many ports are available and make sure there are enough for your extra devices. If you have an Xbox One, a PS4, and a Switch, you will need enough HDMI ports for each device plus your cable box. Otherwise you will have a fun (and frustrating) time swapping cords manually every time you change devices.


After reading through these key points, if you decide you would rather not memorize all the different options and would rather let us point you toward some reliable choices, head on over here to see our “Good”, “Better”, “Best” picks for TVs in early 2018. All of our picks are great options, and based on your price point, you will have a great product no matter which you choose.

Good luck out there!