Friday, January 30, 2015

Why not use a TV as a monitor?

So, large-screen, LED-backlit LCD televisions are amazing. They look great, and they've become quite affordable over the past few years.

Large computer monitors, on the other hand, are still really expensive... though the definition of "large" has changed over the past ten years from 17", to 19", to 21", to 24", to 27", and for the last couple of years has topped out in the range of 30"-32". That's about the most people can comfortably fit on their desktop and still see the screen edge to edge without turning their heads.

Most people who want more screen area just get two 24" monitors, because they don't need to look at a single big, high resolution image very often. Those who do bite the bullet, buy the 27" to 32" monitors, and pay the extra money.

So, since 32" tv's are so cheap, and so great, why not just buy one of those, or even a 37" or 40" etc... for a big monitor?

Yeah... why not use a flat screen TV as a monitor?

At first glance, it makes sense, and for some people in some applications, it absolutely does. But, there are a number of factors you need to take into account.

First, the sharpness, contrast ratio, luminance, and response time are generally considerably better on decent computer monitors than on most televisions.

Those qualities are expensive, and get more so as screen size goes up. They're expensive on TVs as well, but to show a good HDTV image you don't need as good a screen (technically, a display panel).

Also, the image processing, color gamut, panel design and the like, are generally different between monitors and televisions.

Computer monitors are designed to display text and graphics very precisely, but not necessarily naturally. PC monitor panels and image processing are designed to make text look very clear, readable, and high contrast without being oversharp, especially black text on white backgrounds. Monitors (at least higher quality ones) are also generally designed to display a much broader range of colors more accurately, meaning with less bias or distortion, and with consistent brightness and contrast, across the whole panel. Better quality monitors can also be color, contrast, and brightness corrected and calibrated, and can adjust the display in ways that correct common computer display issues (of course, how well they actually manage to do all that is another question).

Televisions generally have panels and image processing designed to make live video and film images, especially fast moving images, more natural looking. They are also designed to make bright objects against a dark background look better, without popping or ghosting, and with a "natural" appearing color, softness, and motion blur. They also have different screen adjustments to address issues typical to HD video signals rather than computer display signals (overscan, position correction, aspect ratio etc...).

This, by the way, is one reason why professional video production monitors are much more expensive than normal televisions. They are designed with all of the capabilities of the best quality PC monitors AND the best televisions, plus additional inputs, image controls, and correction and calibration capabilities (as well as some other things, like refresh and color/refresh/channel sync lock).

Isn't 1080p good enough?

The issue isn't resolution... 1080p is good enough... at the right screen size and viewing distance.

The issue is our vision, contrast, pixel size, and viewing distance.

PC monitors are designed to be looked at closely, from short distances. TVs are designed to be watched from farther away, taking in the whole screen at once.

A 1920x1080 screen at 50" and a 1920x1080 screen at 24" have VASTLY different PPI (pixels per inch): the big screen works out to about 44ppi and the smaller screen to about 92ppi.

My phone has a 5.1" 1920x1080 screen at  432ppi.

That's a 0.6mm, a 0.28mm, and a 0.06mm pixel size, respectively.

Whether a particular pixel count is acceptable or not, depends on how far away your eyes are from the screen.

The average human can see an individual 0.6mm pixel in a contrasting field (one white pixel on black) from approximately 78", a 0.28mm pixel from about 37", and a 0.06mm pixel from about 8".
You can get these numbers for yourself for any screen size here: https://www.sven.de/dpi/  and here: http://isthisretina.com/
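If you'd rather check the math than trust a web calculator, here's a quick Python sketch of the same arithmetic (mine, not taken from either of those sites). The one-arcminute threshold it uses is a common rule of thumb for 20/20 visual acuity, not a hard physiological limit:

    import math

    def screen_stats(diag_in, width_px, height_px):
        # Pixels per inch, from the diagonal pixel count and diagonal size
        diag_px = math.hypot(width_px, height_px)
        ppi = diag_px / diag_in
        pitch_mm = 25.4 / ppi  # physical size of one pixel, in millimeters
        # Distance at which one pixel subtends one arcminute (~20/20 acuity)
        one_arcmin = math.radians(1.0 / 60.0)
        distance_in = (pitch_mm / math.tan(one_arcmin)) / 25.4
        return ppi, pitch_mm, distance_in

    for name, diag in [("50in TV", 50), ("24in monitor", 24), ("5.1in phone", 5.1)]:
        ppi, pitch, dist = screen_stats(diag, 1920, 1080)
        print(f"{name}: {ppi:.0f} ppi, {pitch:.2f}mm pixels, resolvable out to ~{dist:.0f}in")

    # Prints roughly 44 ppi / 0.58mm / ~78in, 92 ppi / 0.28mm / ~37in,
    # and 432 ppi / 0.06mm / ~8in -- the same figures quoted above.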

So what's the difference and what's the damage?


In the real world, you don't need super high quality, ultra contrast, ultra high pixel density, etc... for general computing. A 24" 1080p screen looks just fine for most things, including video, most gaming, and small text.

Cheaper 1080p screens in the 32" size range are now available for $250 or so. They look fine for displaying HDTV. Unfortunately, they look like crap when used as computer monitors from typical chair-to-screen distances. The text can be difficult to read, window edges look weird, games look weird, colors don't look right, and black and white look like grey and brighter grey (usually with some blue, green, or yellow mixed in).

That said, they're not entirely unusable; for video, large text, status displays, etc... they're fine. They're also fine at 6 feet away. A slightly more expensive TV in the same size range may also have more controls and options, so you can calibrate and compensate and make it look better.

30" to 32" computer monitors start at twice that (for 1080p screens), and go up into the $5,000 range (for 4k 4096x2160). They also look great at typical chair to screen distances for anything but small text (at 1080p).

For small text, even 27" is iffy at 1080p. For a 27"-32" screen you really want something like 2048x1152, 2048x1536, or 2560x1440, or go to QHD/4k.

Professional video production monitors in the 30"-32" range, on the other hand, START at around $5,000 for 1080p, and go into the $30,000 range for 4k. They look amazing at any distance (once you get used to a professional video screen, other screens look like crap in comparison).

If there weren't good reason for it, no-one would spend that extra money.

Wednesday, January 07, 2015

Meaning and Understanding

In order to communicate usefully and meaningfully (is anything less really communication?), one must be able to understand what others say, and they must be able to understand what you say.

More importantly, you absolutely must understand what they MEAN.

Obvious yes?

So then why are so many people attempting to make it so hard for others to understand them?

In order to communicate with someone, you must have shared meaning with them.

You must have shared definitions, shared context, shared points of reference; or you must be able to create these things, in your interactions with them.

You must be able to relate things in your own life and experience, to similar things in theirs, and be able to explain the differences (you must be able to share idiom and to analogize).

Further, you have to know where you have shared meaning, and where you don't. Otherwise you might say one thing, and they'll understand (or misunderstand), something else entirely.

It's a case of not being able to ask the right questions, because you don't know what you don't know.

I am a member of several different subcultures, where individuality, the "unusual", the extreme, the outliers... are "common", even celebrated.

However, these are also subcultures which tend to infinitesimalize relatively small differences. To create terminology for them. To inhabit them, wrap identities around them, and unfortunately too often factionalize around them (look up "the narcissism of small differences").

For all these reasons, and many more, it is especially important that we be able to communicate clearly. That when we say things of significance, we are operating with a set of shared definitions and assumptions. That we have shared meaning, around our actions and interactions.

The potential for hurt or harm is so great, the need for clarity is all the greater.

The difficulty is, often, our cultural assumptions are transparent to us; and utterly alien to others outside of our culture (or subculture).

In most subcultures, "Good morning" is a friendly greeting, and "Hey, fuck you" is a horrible insult.

MOST subcultures, but not all...

"Hey, fuck you", IS a warm friendly greeting, in some subcultures...

The military, commercial kitchens, athletic fields, construction sites... Really anyplace where people (mostly guys) "busting each others balls" is part of the culture of comradeship and respect.

It's when the guys DON'T insult you, screw with you, bust your balls etc... that they are expressing their dislike or lack of respect for you. It means they don't care enough to bother, don't respect you enough, or don't think you can take it.

You wouldn't BELIEVE some of the insults my friends and I have for each other... never mind the dynamic between older and younger brothers...

But... knowing that, and being able to deal with that, depends on shared cultural understanding, and therefore having shared meaning and context.

If you're a polite upper middle class American woman, and you're suddenly dropped into a world, where people express respect and affection for each other by calling each other "bitch", "whore", "faggot" (certain gay subcultures for example)... You're probably going to be appalled, you will likely be offended, and you're certainly going to have a hard time understanding what is being communicated, and communicating in return.

Until you develop shared meaning and context.

This is something that an unfortunate number of folks in "alternative lifestyle communities" seem to miss... (and others as well, I'm just using this as a convenient and obvious example).

They seem to carry around the assumption that somehow, everyone is supposed to understand their exact individual and specific meaning for something, which may mean something entirely different to someone else... and they get offended when you don't.

There are these terms that they make up entirely, or use differently from everyone else; and yet they seem to believe they have the right to be offended when others don't understand or "respect" their personal meaning or usage... and to force other people to use it while attempting to communicate with them (or worse, to refuse to attempt to communicate with anyone unless the other party already understands their preferred usage).

Then of course there are those who, in reaction to the type of person I describe above, and in the attempt not to give offense, account for EVERY POSSIBLE OPTION, COMBINATION, OR VARIANT, IN EVERYTHING THEY SAY...

Can you tell that irritates me...

It's a terrific irritation, a waste of time, and just plain destructive to real communication and understanding.

This is one of the problems I have with people who keep trying to find infinitely small divisions of categorization for their "identity", or their gender, or their sexuality, or their ideology or any other damn thing; particularly those who get offended if you don't use, or don't understand, their preferred term for their self identification.

Fine, you may want to call yourself "queer oriented transgenderflexiblequestioning blondie"...

...but unless someone has direct personal knowledge of the multiple subcultures I drew those descriptions from, and the tiny shades of difference between multiple terms, no-one is going to have the slightest clue what you are on about. You're just going to irritate them, and make communication with them more difficult.

And sorry, no, everyone does not have an obligation to "respect your choices and preferences".

Neither your mere existence, nor your particular preferences, create any obligation for me to do ANYTHING WHATSOEVER, except not trespass on your fundamental rights. Everything else is optional, and a matter of cultural practice and social convention.

If you are explicitly and deliberately using language, terminology, and definitions, outside of cultural practice and social convention... How exactly is anyone supposed to know what to do, how to treat you, what to call you etc... ?

One shouldn't need to be an Oxford don of linguistics and semiotics, to understand what it is you wish to be called, what your interests and hobbies and preferences are, what you don't like etc...

How about this...

Those of you who are so concerned about others getting your "label" wrong?

Is your own sense of self worth, and identity, so weak, that it cannot tolerate others not uniquely and specifically acknowledging and reinforcing it?

How about you like yourself, respect yourself, and respect others enough to not give a damn about labels and terminology, except as a way of facilitating meaningful communication and understanding?

How about you try not getting offended, and instead try to help other people understand you better... and try to understand them better?

Labels CAN be important, to facilitate communication, to speed things up, and to reduce the potential for misunderstanding... but you know what's more important? Shared meaning, shared context, and shared understanding.

In that same vein, definitions ARE important. Critical in fact.

The potential for harm inherent in misunderstandings in this world... It's just too great, to make the risks even higher through miscommunication and misunderstanding.

If you don't know the definition of an important point, clearly and completely, it's absolutely critical you ask.

If the meaning of an important point is ambiguous, or there are multiple equally valid meanings... particularly if they are contradictory; it is critical to reach shared understanding and clarity.

When the meaning of a word, phrase, term etc... is well understood in a particular subculture; it's incumbent on you to understand and use that definition, when dealing with members of that subculture, in their "own house". When dealing with those outside your particular subculture, you cannot expect them to automatically know and use your own specific definitions and meanings, which are different from their own.

Or is that just too hard?

Saturday, January 03, 2015

The Minimum Wage Lie

When “progressives” say “the minimum wage hasn’t kept up with inflation”, they're lying.

Not shading the truth, exaggerating, or interpreting things differently… they are flat out lying.

… And what’s more, the ones who made up the lie in the first place, know they're lying (the rest mostly just parrot what they’ve been told).

What exactly would “keeping up with inflation” mean?

The minimum wage has been $7.25 an hour since 2009.

In 1938, when the federal minimum wage was established, it was $0.25 an hour. In constant dollars (adjusted for inflation) that’s $4.19 as of 2014.

So, not only has the minimum wage kept up with inflation, it’s nearly doubled it.

Ok… well, what about more recently?

Minimum wage 15 years ago in 2000: $5.15, or $7.06 in constant dollars.

Minimum wage 20 years ago in 1995: $4.25, or $6.59 in constant dollars.

Minimum wage 25 years ago in 1990: $3.80, or $6.87 in constant dollars.

Minimum wage 30 years ago in 1985: $3.30, or $7.25 in constant dollars.

Funny… that’s exactly what it is today… How shocking.

So, for 30 years, the minimum wage has not only kept up with inflation, for most of that time it’s been ahead of it.
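These constant dollar figures are simple to reproduce: take the nominal wage and multiply it by the ratio of the current CPI to the CPI for the year in question. Here's a minimal Python sketch of that arithmetic, using rounded BLS CPI-U annual averages from memory (pull the official series from bls.gov if you want exact figures); it lands within a cent or two of the numbers above:

    # Approximate BLS CPI-U annual averages -- ballpark values, not official data.
    CPI = {1938: 14.1, 1968: 34.8, 1985: 107.6, 1990: 130.7,
           1995: 152.4, 2000: 172.2, 2014: 236.7}

    def constant_dollars(nominal, year, base_year=2014):
        # Inflation adjustment: scale the nominal amount by the ratio of CPI levels
        return nominal * CPI[base_year] / CPI[year]

    for year, wage in [(1938, 0.25), (1968, 1.60), (1985, 3.30),
                       (1990, 3.80), (1995, 4.25), (2000, 5.15)]:
        print(f"{year}: ${wage:.2f} nominal = "
              f"${constant_dollars(wage, year):.2f} in 2014 dollars")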

So, how are they lying?

The way “progressives” claim the minimum wage hasn’t been “keeping up with inflation” is by comparing today with the highest level it has ever been: almost 50 years ago, in 1968, when the minimum wage went to $1.60 an hour ($10.86 in constant dollars).

This was a statistical anomaly.

There’s a long and loathsome tradition of lying with statistical anomalies.

At $1.60 an hour, the minimum wage in 1968 was a huge 20% spike from what it had been just 3 years before in ’65, more than 40% above what it had been in 1960, and nearly double what it had been 12 years before in 1956, when politicians started pushing through minimum wage increases faster and bigger (again, all in constant dollar terms; the minimum wage at the beginning of 1956 was about $6.30 in constant dollars).

In constant dollar terms, the minimum wage today, is about the same as it was in 1962 (and as I showed above, 1985).

It just so happens that from 1948 to 1968 we had the single largest 20-year wealth expansion seen in the history of the nation (about 5-8% annual growth)… which then crashed hard starting at the end of ’68.

From 1968 to 1984, the U.S. had 16 years of the worst inflation we ever saw, and the purchasing power of ALL wages fell significantly, as wages failed to come even close to keeping up with inflation (we saw 13.5% inflation in 1980 alone, which is about what we see every 4 years today).

It took until 1988 for real wages to climb back to their 1968 constant dollar level, because we were in a 20 year long inflationary recession, complicated by two oil shocks and a stock market crash (actually a couple, but ’87 was the biggest one since ’29).

However, the minimum wage was boosted significantly in that time period, far more than other wages rose, and stayed above the 1962 level until the end of that high inflationary period in 1984, declining slightly until 1992, then spiking and declining again until 1997, etc… etc…

By the by… household income in 1968? Appx. $7,700, which is about the same as today in constant dollar terms… about $51,000 (about 8% more than it was in 1967, at $47k). Which is almost exactly what it was in 1988 as well. Household income peaked in 1999 and 2007 at around $55,000, and troughed in 1975 at around $45,000.

Of course, income was on a massive upswing from 1948 to 1968 (and in fact had been on a massive upswing overall since 1896 with the exception of 1929 through 1936). In 1941 household income was about $1500 ($24,000 constant), in 1948 $3,800 ($37,000 constant).

Like I said, it was the single greatest expansion in real income and wealth over a 20 year period, in American history.

1968 was a ridiculous historical anomaly… Not a baseline expectation.

So, from 1964 to 1984, the minimum wage was jacked artificially high (proportionally, far above median wage levels), and “progressives” chose to cherry-pick the absolute peak in 1968 from that part of the dataset, in order to sell the lie.

A living wage?

As to the minimum wage not being a living wage… No, of course it’s not. It never was, it’s not supposed to be, and it never should be.

The minimum wage is intended to be for part time, seasonal workers, entry level workers, and working students.

Only about 4% of all workers earn the minimum wage, and less than 2% of full time workers earn the minimum wage.

Minimum wage is what you pay people whose labor isn’t worth more than that. Otherwise everyone would make minimum wage. But since 98% of full time workers can get more than minimum wage, they do so.

What should the minimum wage be?

Zero.

Wait, won’t everyone become poor suddenly?

No, of course not. Literally 98% of full time workers already get more than minimum wage. If we abolished the minimum wage, most of them wouldn’t suddenly be paid nothing.

Wages should be whatever someone is willing to work for. If you’re willing to work for $1, and someone else isn’t, you get the job. On the other hand, if an employer is offering $10 and no-one is willing to take the job for that, they need to offer $11, or $12, or whatever wage someone is willing to take.

If you don’t want to work for $7.25 an hour, don’t take the job. If nobody offers you more than that, too bad, but that’s all your labor is worth.

If you are willing to work for someone for $7.00, and they’re willing to pay you $7.00, what right does some “progressive” have to tell either of you, that you can’t work for that much?

No-one is “exploiting the workers”, if those workers took the jobs voluntarily, and show up for work voluntarily… If all you can find is a job for less than what you want to work for, you’re not being exploited, THAT’S ALL YOUR LABOR IS WORTH TO THOSE EMPLOYERS.

You may think your labor is worth more, but things aren’t worth what you want them to be worth; they’re only worth what someone else is willing to pay for them.

But let’s be generous…

All that said, I don’t think we’ll be able to eliminate the minimum wage any time soon.

So, to those “progressives” who would say “let’s make the minimum wage keep up with inflation”, I agree wholeheartedly… Let’s make it $4.19.

Oh, and if you don’t believe me on these numbers, they come from the Department of Labor, the Department of Commerce, and the Census Bureau. If I’m lying to you, it’s with the government’s own numbers… the same ones “progressives” are lying to you with.