Sunday, December 14, 2014

Connection Monitoring Software

My last post, "Connection Quality," went into technical detail about how to troubleshoot a slow Internet connection and determine the cause of the slowness.

So here is my idea for a software program that works with any application that uses the Internet: streaming multimedia, remote connectivity such as Citrix, VMware View, and RDP, and online gaming, to name a few.

Let's start with an example of a Citrix session using the ICA protocol. A student in a coffee shop connects to the school's Citrix farm, where they have access to an online portal with different apps for their classes. The student gets very poor performance and calls the help desk. After some troubleshooting over the phone, the help desk determines that the problem is the Internet connection at the coffee shop.

Now let's introduce the concept of software that not only monitors the quality of the connection but also pinpoints where the issue is.

A small executable would run as a service or in the background all the time. It would sense when a network connection the software is able to monitor is opened, then present a small graph or meter overlay somewhere on the screen that is out of the way. The overlay would only show when the connection first starts or when the quality is bad or changes. These display settings would all be configurable, including the thresholds for what counts as a "bad" or "good" connection.

Because the Citrix session connects to a server at the school's data center, we need to monitor the entire path. This means pings, traces, quality of service, throughput on the protocols being used, dropped packets, etc., all along the route. The software may identify that the issue is with the router or the Internet connection itself at the coffee shop. The software is smart enough to not only test the speed and latency from inside the coffee shop but also ask an external server to test the path from outside the firewall. The results are then compared, and in this instance it verifies that it is indeed a slow connection at the coffee shop.
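To make the idea concrete, here is a minimal sketch of the inside/outside comparison in Python. The measurement functions and their numbers are hypothetical stand-ins; a real product would run active probes from both the client and the external test server.

```python
# A minimal sketch of comparing a probe from inside the coffee shop with
# one run by an external server. Both measurement functions are
# hypothetical placeholders for real active probes.

def measure_local_path():
    """Pretend probe from the client toward the data center."""
    return {"latency_ms": 240, "loss_pct": 3.5}   # placeholder numbers

def measure_external_path():
    """Pretend probe run by an external server outside the firewall."""
    return {"latency_ms": 35, "loss_pct": 0.1}    # placeholder numbers

def diagnose(local, external, latency_limit=60, loss_limit=1.0):
    local_bad = local["latency_ms"] > latency_limit or local["loss_pct"] > loss_limit
    external_ok = external["latency_ms"] <= latency_limit and external["loss_pct"] <= loss_limit
    if local_bad and external_ok:
        return "Problem is local: the coffee shop's connection or router."
    if local_bad:
        return "Problem is along the wider path, not just the local hop."
    return "Connection looks healthy."

print(diagnose(measure_local_path(), measure_external_path()))
```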

The user is then notified that they should use a different Internet connection; otherwise they may have a very poor experience with their Citrix session.

Now let's say the issue is downstream with another router. The software would identify that, alert the user that there is an issue along the path, and warn that they may notice a slow connection. It would suggest trying another Internet connection or, if the problem router is nearby, possibly route them through a proxy (likely a premium service that the school provides or the user pays for) that takes a different path and steers their traffic around the problem router.

All of this information helps the user and reduces help desk calls, and more importantly, it is reported anonymously back to central servers that track which routers are causing issues. ISPs all over the world can subscribe to this database for free alerts, so when a router they are responsible for has an issue, they will be notified, improving the experience for everyone around the world.
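As a rough sketch of how the central database side might work, here is a toy aggregator in Python. The threshold, the subscription table, and the alert mechanism are all illustrative assumptions, not a real API.

```python
# Toy sketch of the central alert database: anonymous reports come in as
# router addresses, and subscribed ISPs get notified past a threshold.
from collections import Counter

REPORTS = Counter()                      # router ip -> recent issue reports
SUBSCRIPTIONS = {                        # hypothetical ISP contact list
    "203.0.113.7": "noc@example-isp.net",
}
ALERT_THRESHOLD = 100                    # reports before the ISP is alerted

def record_report(router_ip):
    REPORTS[router_ip] += 1
    if REPORTS[router_ip] == ALERT_THRESHOLD and router_ip in SUBSCRIPTIONS:
        # A real system would send an email or webhook, not print.
        print(f"ALERT {SUBSCRIPTIONS[router_ip]}: {router_ip} is degrading traffic")

for _ in range(100):
    record_report("203.0.113.7")
```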

It is even feasible that when the central servers see a large number of issues on particular routes, they can send automated signals to subscribing routers to "poison" a route, making routers go around it instead of continuing through the problem.
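As a toy illustration of the idea (not real routing-protocol code), "poisoning" a route just means subscribing routers stop considering any path through the flagged hop:

```python
# Candidate paths to the school, as lists of hops. All names are made up.
paths = [
    ["shop-router", "isp-a", "bad-router", "backbone", "school"],
    ["shop-router", "isp-a", "isp-b", "backbone", "school"],
]
flagged = {"bad-router"}                 # hop reported by the central servers

usable = [p for p in paths if not flagged & set(p)]
print(usable[0])                         # traffic detours around the problem
```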

A website accessible to the general public would graphically represent issues around the world, as well as general statistics on which sites are busy, which games people are playing the most, and so on.

The primary revenue stream for this product would be the software itself, paid for by the connection endpoint owner; in this scenario, the school. The end users would benefit from being notified of a bad connection, and the endpoint owner would see fewer help desk calls to troubleshoot bad connectivity. Additional revenue streams, such as a proxy that routes around troubled routers and provides caching services, could be paid for by the end user; in this case, the student. If the school wanted to provide this service, it could offer it only when connecting to its services.

I would suggest a free version that offers basic monitoring of connection quality but does not get down to the protocol level and only monitors out a few hops. This keeps a check on the quality of the local Internet connection. If the user connects to a resource that pays for the service, the software automatically engages the full feature set of quality monitoring, plus proxy services if those are also paid for.

Tuesday, December 9, 2014

Connection Quality

The world today is so connected that a computer is almost useless without an Internet connection. But many people do not have a good connection to the Internet. More specifically, the route from their computer to the backbones of the Internet is slow. But what is slow? A slow or poor Internet connection comes down to many factors, but ultimately it is the end user's perception of having to wait. Here are some common issues that cause "slowness" (a small measurement sketch follows the list):

Latency: How long it takes for a single packet to make a round trip, in milliseconds. Less is better; it ranges anywhere from under 1 ms on a local network to 200 ms over an Internet connection. Typically you want this under 60 ms for a "good" connection.

Jitter: The amount of change from one ping to the next; generally this is reported as an average. Less is better; under 20 ms is good for most connections.

Packet Loss: How many packets are lost in transmission. Less is better; it should be well under 1%.

Hops: How many routers a packet must travel through. Less is better; there is more chance of packet loss with more hops.
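Here is a small sketch of measuring the first three of these. Real tools use ICMP pings, which need admin rights on many systems; timing TCP connections is a rough stand-in that works anywhere, so treat the numbers as approximations.

```python
# Estimate latency, jitter, and loss by timing repeated TCP connections.
import socket
import statistics
import time

def tcp_ping(host="google.com", port=443, samples=5):
    times = []
    for _ in range(samples):
        start = time.perf_counter()
        try:
            with socket.create_connection((host, port), timeout=2):
                times.append((time.perf_counter() - start) * 1000.0)
        except OSError:
            pass                          # a failed attempt counts as "lost"
        time.sleep(0.2)
    loss_pct = 100.0 * (samples - len(times)) / samples
    latency = statistics.mean(times) if times else float("inf")
    deltas = [abs(a - b) for a, b in zip(times, times[1:])]
    jitter = statistics.mean(deltas) if deltas else 0.0
    return latency, jitter, loss_pct

latency, jitter, loss = tcp_ping()
print(f"latency {latency:.1f} ms, jitter {jitter:.1f} ms, loss {loss:.0f}%")
```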

As you can see, there are many factors involved in that little TCP packet that goes from your computer to the server and back. TCP is a reliable protocol, whereas UDP (normally used for streaming video and music) is an unreliable protocol. TCP will retransmit data when the receiver does not acknowledge it. There is also an end-to-end handshake to confirm that the data was received, along with checksums to prevent corrupt or incomplete data from being accepted.

All this reliability requires some overhead. Most people do not realize that upload bandwidth is used when downloading, and vice versa: every downloaded segment has to be acknowledged upstream. So when your ISP says you have a 10 Mbps download and a 512 Kbps upload, it is possible that your download speed is actually hindered by the upload limitation. For those who want to understand the technicality of this, here goes...

A typical packet size is 1500 bytes (jumbo packets are larger, and we are not going to talk about those here). A header is typically 40-60 bytes. The header has information on where the packet came from, where it is going, and what is in it. If the header is 40 bytes, that leaves 1460 bytes for data, which means a minimum of about 2.7% overhead relative to the payload. Now if you are not using all the space in the packet, say only 100 bytes of data is sent, then the overhead would be 40%. If you are doing something like playing an online game that sends lots of small pieces of data, you could have high overhead, and now your Internet connection is saturated with lots of small packets talking back and forth.
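The arithmetic from that paragraph, spelled out (assuming the 40-byte header):

```python
# Overhead relative to the payload for a 40-byte header.
HEADER = 40

def overhead_pct(payload_bytes):
    return 100.0 * HEADER / payload_bytes

print(f"Full 1460-byte payload: {overhead_pct(1460):.1f}% overhead")  # ~2.7%
print(f"Small 100-byte payload: {overhead_pct(100):.1f}% overhead")   # 40.0%
```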

Programmers are smart and are pretty efficient with the transmission of packets. So if there is a request for data, say your position in an online game, instead of sending 10 small packets the software might combine the requests into 1 larger packet. That means less overhead, and if the connection is bad, it is faster to retry sending 1 packet versus 10.
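A quick bytes-on-the-wire comparison, using the same 40-byte header assumption and ten hypothetical 100-byte position updates:

```python
HEADER, UPDATE = 40, 100

separate = 10 * (HEADER + UPDATE)   # 10 packets, 10 headers
combined = HEADER + 10 * UPDATE     # 1 packet, 1 header (fits in 1460 bytes)
print(f"10 packets: {separate} bytes, combined: {combined} bytes")
print(f"savings: {100.0 * (separate - combined) / separate:.0f}%")
```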


SOOOOooooo what can you do to improve your connection speed? There are lots of tricks out there, including changing your MTU (maximum transmission unit, or packet size), installing software that promises to make your connection faster, or simply getting a better ISP. First of all, changing the MTU is not likely to do much, as most network equipment will automatically pick the best MTU. Software installed on your computer is generally a good way to get malware or junk software. The best solution is to get the best ISP you can.

Money is not everything, so paying for the fastest connection might be a waste of that money. When you pay for a connection that comes in different speed tiers, all the ISP is doing is "capping" the speed based on what you pay. For example, a cable company might offer 10 Mbps down and 2 Mbps up (which I will refer to as 10/2 going forward, or down/up), 25/5, and 50/10. These speeds are controlled by software, so when your connection approaches the set limit, it slows you down. This is also known as traffic shaping. Also, some connections have a "burst" where the first, say, 10 MB (that's megabytes) of data downloads twice as fast and then slows down to the speed you pay for. This makes customers happier and makes speed test results look much nicer too.
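A common way to implement this kind of cap with a burst is a token bucket. Here is a minimal sketch; the rates and bucket size are illustrative, not any real ISP's configuration:

```python
# Token-bucket shaper: a full bucket allows a burst, then throughput
# settles to the refill rate (the speed tier you pay for).
class TokenBucket:
    def __init__(self, rate_mbps, burst_mb):
        self.rate = rate_mbps * 125_000          # bytes added per second
        self.capacity = burst_mb * 1_000_000     # a full bucket is the burst
        self.tokens = self.capacity

    def tick(self, seconds=1.0):
        self.tokens = min(self.capacity, self.tokens + self.rate * seconds)

    def send(self, nbytes):
        """Return how many bytes the shaper lets through right now."""
        allowed = min(nbytes, int(self.tokens))
        self.tokens -= allowed
        return allowed

bucket = TokenBucket(rate_mbps=10, burst_mb=10)
for second in range(5):
    sent = bucket.send(5_000_000)                # user pushes 5 MB every second
    print(f"second {second}: {sent / 1_000_000:.2f} MB passed")
    bucket.tick()
```

Run it and you can see the burst: the first couple of seconds pass far more data than the steady 1.25 MB/s that a 10 Mbps tier actually allows.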

The problem is that the connection quality is the same; you just have a speed limit. Think of it as driving a fast car but not being allowed over 100 mph, or 80 mph. Do you really need to go that fast? If you check your email and surf the web, you need a basic speed connection. If you stream movies all day, multiple at a time, you need the faster speed. For gaming it really does not matter, as the bandwidth used is generally not that high; latency is the more important factor.


How can you check the quality of your connection? Here are a few sites that I use that will help.
http://www.pingtest.net/ (requires Flash and Java)
http://www.speedtest.net/ (requires Flash)
http://speedof.me/ (requires an HTML 5 browser like Chrome, Firefox, or IE 10+)

But these tests do not show the whole picture. They test the connection from your computer to their servers. What if the problem is not your connection? What if the issue is a router between you and your destination? A few quick troubleshooting tests will help with this. Here is what I do to check my connection.

1. Go to several large websites, like Google, Yahoo, MSN, eBay, etc., to see if they all load slowly. If they all load slowly, the issue is likely with your computer or your connection.

2. Try another computer on the same network. Again, if the same issue occurs, it is likely your connection.

3. Start a trace. In Windows, open a command prompt (go to the Run dialog, type CMD, and press Enter). Use the tracert command followed by an address, like this (without quotes): "tracert google.com". You will see a list of addresses, some with names next to them. These are the "hops," or routers, that your packets travel through to reach google.com. If you see an * (meaning no reply), don't worry; some routers will not reply, and that is normal. Look down the list: if you see very high times, like well over 100 ms, that is likely where the slowdown is. If those high numbers are at the beginning of the list, the issue is close to you. If they are near the bottom, the issue is more toward Google's end. (A script that automates this check follows this list.)

4. pathping is another command that is good for troubleshooting. It is like a trace route but does more analysis over the path with lots of pings. It takes longer to run but can provide more detailed information on where the issue is.

5. Reset your network equipment, as that generally never hurts. Run some of these tests again to see if the problem is still there. If it is, you might have to contact your ISP. If the issue is further downstream, then sorry, there is nothing you can do about it. Just wait and try again later.
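If you want to automate step 3, here is a rough script that runs the trace and flags slow hops. It parses the "NN ms" fields of Windows tracert output; the exact format varies by OS and locale, so treat the parsing as an assumption to adapt.

```python
# Run a trace and print any hop whose worst reply time exceeds a threshold.
import re
import subprocess

def slow_hops(host="google.com", threshold_ms=100):
    out = subprocess.run(["tracert", "-d", host],   # "traceroute" on Linux/macOS
                         capture_output=True, text=True).stdout
    for line in out.splitlines():
        times = [int(ms) for ms in re.findall(r"(\d+)\s*ms", line)]
        if times and max(times) > threshold_ms:
            print(f"possible slowdown: {line.strip()}")

slow_hops()
```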

Each time you make a connection, the path may change. Routers are constantly talking to each other to find the best route, so if one router goes down or a link is really busy, they may all send out signals to find a better path. This could change the route and "fix" the problem you had. Go ahead and try it: run the same trace every day and at different times. You may find that the path changes a bit all the time. As ISPs add more bandwidth and more routers, a better path may show up and the routers you talk to will send you along it.


In conclusion, the Internet is truly an amazing beast with massive redundancy everywhere. Taking down the entire Internet is just about impossible, and it has been tried: several attempts have been made by targeting DNS servers around the world. These caused slowness but never actually made the Internet totally inaccessible. DNS is what resolves a name like google.com to an address like 173.194.33.165. Think of directions to a house: if you know the person's name but have no address, you will never find them.
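That lookup is exactly one call in most languages; for example, in Python:

```python
# Ask DNS to resolve a name to an address (the result varies by location).
import socket
print(socket.gethostbyname("google.com"))
```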

So the next time you have slow Internet, take a few minutes to understand what the issue is. Almost always it is going to be at your end or with your equipment. Before you call tech support or freak out, reboot your router, modem, and computer; basically, reboot everything.

Friday, August 29, 2014

Thermal Cameras Everywhere... Beware...

Up until recently, thermal cameras were quite expensive and generally only used in high-end test equipment or military applications.

That is now changing with an announcement from Seek Thermal (1) that they will soon offer a smartphone attachment for around $250 that will let you see temperature in real time on your smartphone screen.

The applications are endless! Just a few examples are:
Finding people or animals in the dark.
Checking the temp on the playground slide for your kids.
Looking for electrical shorts in a wall.
Looking for leaks in an air duct.
Testing the effectiveness of insulation.
Watching for forest fires.
And so much more...

Now, as with all technologies, people will always find a dark side; there are endless possibilities for abuse as well. Since this technology used to be difficult for most people to obtain, it was not much of a concern. But with the significantly lower price point, more people will have access, and thus more people will do bad things. This is just inevitable.

Here are a few ideas of malicious things you could do with a thermal camera:
Spy on people in the dark.
Look for a guard dog in a backyard to avoid.
Find a car that just parked in a parking lot, so you know the driver is not likely to be back for a while.
Just use your imagination and think of the privacy implications.

Finally, my last thought on this topic is that you may soon be able to tell what mood someone is in just by taking a thermal image of them. I believe it will not be long before there is an app that, with a thermal sensor and possibly an IR sensor, can tell the user what mood a person is in. This could potentially change everything.

Let's say you want to ask the boss for money to fund a project you're working on. Better check their mood first. Good? Let's ask for money. Bad? Let's wait. Very happy? Let's ask for a lot of money.

I think this tech could also find its way into the dating scene. Imagine being able to read the body of the opposite sex and know what they are feeling just by their thermal signature. We may have to wear thermal clothing just to protect ourselves from the curious, or worse, the malicious.

References:
(1) http://m.us.wsj.com/articles/smartphone-add-ons-offer-thermal-imaging-1408396425?mobile=y

Wednesday, August 6, 2014

Who is Responsible for an Algorithm's Outcome?

I was reading an article today that stated:
 "A Hong Kong court today ruled that local businessman Albert Yeung Sau-sing can sue the company over its Autocomplete function"(1)

Google states:
"...that it couldn't be held responsible for the suggestions made by Autocomplete. It was, it argued, a “mere passive facilitator”, with its algorithm based on the content of previous searches."(1)

This is very troubling to me because, even though Google originally created the intelligence behind the Autocomplete function, the data it provides does not come from any Google employee but rather from the users of Google's products.

So it begs the question: who is responsible for the outcomes of algorithms and, eventually, as these algorithms grow, artificial intelligence ("AI")?

Take the human example. You have a child that you raised and taught right from wrong to the best of your ability. One day they say something you never told them, and it gets them in a lot of trouble. Maybe they heard it from a friend, or TV, or even a song. Regardless, is the parent responsible for them being in trouble? I don't mean responsible in the sense of the child possibly being a minor; I mean responsible in the sense that the parent somehow made the child say the inappropriate comment.

So then take Google's Autocomplete. I love it and it saves me a lot of typing. The algorithm behind it is amazing, and I thank Google for this wonderful technology. But is it a stretch to say that when someone types their name and a negative or even libelous phrase is suggested, the fault somehow lies with the algorithm itself? Results are only as good as the data. Plus, let's be honest, sometimes the truth hurts.

To take the side of the Hong Kong court and force Google to modify its algorithm and maintain an exception list would be ludicrous. Not only is it blatant censorship, but policing the Autocomplete suggestions and dealing with removal requests would be a huge drain on Google's resources.

This does not just have to do with Google. It is a much broader notion and will become a larger concern as more intelligence is integrated into our daily lives through algorithms. Ultimately we will (and may already) have AI making decisions all the time that affect us, possibly adversely.

As with a child you teach right from wrong, they are going to do right sometimes and wrong other times. Your goal as a parent is to give them tools so that they can make the best decisions for themselves.

Is this not what an intelligent algorithm is intended to do? Don't we want our computer creations to be smart enough to gather data available to them to make the best decision? Can Google or other software companies really force an algorithm to be nice all the time and be perfect?

I say no, because humans are not perfect, and therefore our creations cannot be perfect. But should censorship and the ignorance of a few destroy these amazing pieces of code that make our world, in my opinion, a much better place?


References:
(1) http://www.forbes.com/sites/emmawoollacott/2014/08/06/more-privacy-woes-for-google-this-time-its-autocomplete/

Thursday, June 26, 2014

I Control the Keys to my Castle of Data

So much of our data is spread across systems all over the world, whether it be Google, Microsoft, Apple, Yahoo, the NSA, the CIA, the IRS, and so on. Your personal data is everywhere.

What private companies do with this data is normally for profit: whatever data they can sell to third parties, or use to provide better-targeted marketing in your search results. This data is VERY valuable. Typically your private data is protected, but any system can be hacked.

When it comes to government agencies, the story is different. They are collecting data, sometimes with your permission and sometimes without, for purposes of control. This could be "good" or "bad," ranging from your DMV records to captured emails that could be read as opposition to a government.

The bottom line is that we have no control over this data at all. It's a permanent record that could have long-term consequences in our lives.

Here is what I suggest: as individuals, we have two sets of keys, kind of like a safe deposit box at a bank. You have one key and the bank has the other, and both keys are needed to open your box.

So now all your private information can be stored on all of these systems, and some pieces of the data can be accessed without both keys. But the very private and uniquely identifying data cannot be accessed without your half of the key. I also suggest that you can expire data, or change your key every so often, to make that old data inaccessible forever.

Let's put this into practice... Let's say you go to a school, and the school tracks your progress and even your learning disabilities, medical records, home address, phone numbers, information on your parents/guardians, etc. This information is mostly necessary for the school administrators to ensure your safety and improve your experience in the learning process. The problem is that this data should only be accessed by certain people, and not every administrator needs access to all of it.

You could limit this access by providing your half of the key when needed. This way, if your teacher needs your home number to call your parents, they need your permission. You would delegate just that access, for that one time, by approving a request. This approval could happen through your smartphone, an email, or even a PIN entered on the teacher's computer.
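One simple way to build the two-key property is 2-of-2 secret sharing: split the key that encrypts a record so that neither half alone reveals anything. Here is a minimal sketch; the XOR scheme and the names are illustrative, and a real system would wrap this in proper key management.

```python
# Split a record key into two shares; both are required to rebuild it.
import secrets

def split_key(key: bytes):
    holder_share = secrets.token_bytes(len(key))              # random half
    user_share = bytes(a ^ b for a, b in zip(key, holder_share))
    return holder_share, user_share

def combine(share_a: bytes, share_b: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(share_a, share_b))

record_key = secrets.token_bytes(32)   # would encrypt, say, the home phone number
school_share, my_share = split_key(record_key)

# The school alone learns nothing. When I approve a request, I release my
# share for that one use and the key can be reconstructed.
assert combine(school_share, my_share) == record_key
print("key recovered only with both shares")
```

This also gives you the expiry idea for free: destroy or replace your share, and any data still locked under the old key becomes unrecoverable.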

The point is that you have control of your data: who accesses it, how much of it they can see, and where. The same would go for medical records, financial records, your emails, phone calls, and all your private information online.

Implementing this will take time and cooperation. Systems will need to be in place to encrypt this data and then allow you to approve access. Gateways will need to be set up to allow systems to make these requests, very much like a payment processing gateway for credit cards.

The mindset of who owns our data needs to change too. We own our data. Those collecting and holding it do NOT. They need our permission to use it, and they should identify to us what it will be used for.

In the end, WE have control of our data! So when the next system gets hacked and all of your personal data is in there, the hackers have worthless bits and bytes without each individual's keys to their castle of data.

Monday, June 2, 2014

Net Neutrality With Layers

There is a lot of discussion about Net Neutrality flying around, with basically two sides to it. Side one is the belief that the Internet should be completely open and uncensored, and that this includes bandwidth speed, not just content. Side two is where companies can pay for a "fast lane," basically premium bandwidth, through the ISPs' (Internet Service Providers') networks.

For example, Netflix, which uses up almost half the bandwidth on some providers' networks, has recently negotiated with Comcast and other major ISPs to have a "fast lane," or priority, for its traffic. This is so that end users will have a better experience streaming their Netflix videos. The complaint is that it's not fair to give priority to any traffic, as it hurts the small companies trying to compete against Netflix.

Here is where the argument begins, and the FCC is in discussions as well to decide whether this practice should be allowed. Now, you have to remember that bandwidth is not free; someone has to pay for it. If not the Netflixes, Amazons, and YouTubes of the world, then the consumer is going to have to make up the extra costs. To compound the issue, the companies typically providing bandwidth to most people's homes also make money providing phone and TV, and consumers are stepping away from those services more every day, trading home phone service for cell phones and TV for streaming services. What is left for these cable, satellite, fiber, and last-mile providers is Internet service with massive demand for high-quality bandwidth. What's more, everyone wants it cheap and unlimited.

I say let the free market decide what to do, and let the ISPs compete for our business by providing us with the best service at the best price. If you don't like what one ISP provides, go somewhere else.

BUT, you say, I don't have a choice, or not much of a choice!

So then there is the discussion of treating ISPs like a utility. This would add regulations (a lot of them, I am sure) to an industry that would then more than likely have to dramatically raise prices to deal with those regulations.

Well, that is where I get into the "Layers" discussion. I have not heard anyone bring this up, but with a background in networking, it makes perfect sense to me.

For anyone who knows the OSI layers, the first two are Physical and Data Link. Yes, there are 7 layers, but for this topic I am only referring to the first two. The physical layer is the infrastructure you can touch: copper and fiber, as well as all the equipment in between. OK, before you technical people tell me that "routers and switches work on layers 1 through 4 and sometimes through 7," I understand that; I am simply referring to the "Physical Layer" as what the data passes through.

Soooooo... let's treat the physical layer as a utility. Now you can regulate the delivery of a product that uses tangible infrastructure. Imagine if there were 6 different power companies, all with their own wires, or 3 water companies with their own pipes; it would be a nightmare to build and maintain all that infrastructure. Right now the only companies that can afford to build these infrastructures are very large ones with lots of money. They then carry data over their networks from all over the world and charge their customers for connectivity to their networks.

Then let's treat the data layer as a free market that is open to whatever businesses want or need to do. With enough competition, the consumer will be the winner almost every time. The companies that own the physical networks will charge a fee to get onto their network and ultimately to the consumer. This is basically how it works now, except that billing is not done directly to the data initiator but rather indirectly, to a data center, for example, that has links to many providers.

I am a true believer in the free marketplace. Ultimately, companies are going to do what they want, and if a government agency tells them what to do, they will find ways around it and/or just charge more money. ISPs will not lose money; they will only pass costs on to consumers.

Thursday, May 22, 2014

The Future of Animation

Have you watched an animated movie lately? Maybe Frozen, or How to Train Your Dragon? The graphics are amazing; yes, they look like cartoons, but they are still amazing. Then there are the video games, some of which are so realistic now that you almost forget you are playing a video game.

So what is the future of all this computer-animated entertainment? It will continue to get more and more realistic, to the point where we will not be able to tell humans filmed on camera apart from CG characters. I believe the cartoons will still be cartoons; otherwise the kids will not enjoy them as much. And when I say kids, I mean kids of all ages.

So what does this mean for actors and actresses? In the next 10 or so years we simply may not need them. They can all be generated by computers in real time, and our entire "Matrix" world will be complete.

Ahhhh! But there is still one piece of the puzzle, and that is the voices. Right now the voices of animated characters are performed by real humans. I believe it is not that far off for this to be computerized as well. Think of the cost savings of not having to hire professional voice actors.

Synthesizing a voice is actually a difficult thing to do; just ask Siri (Apple) or Cortana (Microsoft), and you tell me how real those voices sound. They don't... As much as their makers love to say they sound good, a small child can tell you that it is a computer voice and not real.

We are a fair ways off from making human voices originate completely from the bits and bytes of the computer world. There are other hurdles to overcome as well, such as giving these voices personality, depth, and realistic tone and pitch. The next question is whether they should sound like a famous person we can relate to, or just have a random pitch and tone.

So the final question is: when we have computer-generated humans and computer-generated voices, will we license famous people's looks and voices, or just make up new "famous virtual people"?

Tuesday, May 13, 2014

Censoring the Search Index

Recently the high court of the European Union in Luxembourg decided that Google is required to "amend" search results to reflect individuals' "right to be forgotten." Google has rightfully fought this all the way to the EU's high court, since it is not creating or controlling the content but simply indexing it. Google claims it is an "information aggregator" and not a "controller" of the information.

This amounts to censorship, and that hurts the Internet more than anything. Google would not even allow bad things said about its own CEO to be removed from its indexes, as doing so would destroy the core idea that the Internet should be uncensored and uncontrolled.

I propose that Google and all other search engines simply remove their servers from any country that demands this type of censorship. If users want to use these search engines, they still can, but their traffic will have to cross political borders into "censor-free" countries to do so.

Think of China. Long story short, Google went there and left. Do I need to say any more on that?


So what is the solution? Censoring anything is bad: textbooks, history, anything. If you did something bad in the past, you are going to have to live with it. Go to the content source and deal with them. Think of going to every library and forcing them to remove a newspaper article about something you did; you would fail miserably, and rightfully so. There is no difference here other than that it's electronic.

The problem compounds if someone can say, for example, "Hey, I don't want that inappropriate picture of me in the index, take it down!" and the index complies by adding an exception. "OK Google": now multiply that by millions of requests and exceptions... Let me know how that works out for you, and for the hundreds of staff you have to hire just to deal with these requests. Plus, how do you know if the requests are legit? Why not rewrite history while we are at it?

Bottom line: Google, Bing, Yahoo, and every other reputable search engine: don't give in to censorship, EVER! If a country or government forces you to censor, pull your servers out of there. Again, remember China.

Finally, since I personally have issues with authority (ask anyone who knows me), I say just ignore the court ruling. What are they going to do? Ban Google and every other non-conforming search index throughout the entire European Union?


Referenced article: http://www.forbes.com/sites/emmawoollacott/2014/05/13/europeans-can-now-force-google-to-strike-irrelevant-search-results/

Tuesday, April 29, 2014

Tracking without losing your privacy

This thought started as a way to collect tolls and track road usage without giving up privacy. I use FasTrak here in California, and it knows when I was on a toll road and even how fast I was going.

Now I personally don't care about that data much since I'm not doing anything that anyone else would care about.

But what if you did care? What if our untrustworthy government decided that instead of a gas tax, all cars would just be tracked for usage and charged accordingly?

This may happen sooner rather than later with alternative fuels and electric cars, which pay little or no gas tax and therefore do not contribute to road maintenance and the construction of new roads.

So how can the government, or a contracted agency, get this data without invading your privacy?

We need a solution that allows an individual to provide enough data to the organization that needs it without compromising privacy. I believe this can be done by doing the following:
- Track only how many miles were traveled in an area.
- Do not track times.
- Report only monthly or quarterly totals.
- Have data collection requests occur constantly, but have the device report only when required.

Let me explain this all...
The agency or organization only needs enough information to generate a usage bill. The device knows where you are, but it does not report that detailed data, only totals for the different areas.

Let's say you are driving through the city of San Diego on surface streets. In one month you drive 342 miles on these surface streets. That's all the city needs to know to send you a tax bill. The idea is that instead of a gas tax paying for road repair, drivers pay for where they actually drive!

The device in your car, which could replace a FasTrak as well, only stores totals. Even if it were compromised, there would be no detailed tracking data.

As you drive around, there are readers, like the ones toll roads have now, that collect the data from these devices. The devices are polled but only answer when their reporting period is due; by staying silent otherwise, they prevent tracking through the collection locations. When a device does answer and uploads its report, the report is packaged anonymously, and only the collection agency can see the contents. There should be no record of which collector received the report, again ensuring the drivers' privacy. (A small sketch of such a device follows.)
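Here is a minimal sketch of such a tracker, assuming the GPS fix has already been mapped to a numbered area (see the next paragraphs). Note what is deliberately not kept: no coordinates, no timestamps, only a running total per area.

```python
# In-car tracker that stores per-area mileage totals and answers a
# roadside poll only when its reporting period is due.
from collections import defaultdict

class MileageTracker:
    def __init__(self, period_days=30):
        self.period_days = period_days
        self.days_elapsed = 0
        self.totals = defaultdict(float)       # area number -> miles

    def record(self, area_number, miles):
        """Called as the car moves; keeps only the running total."""
        self.totals[area_number] += miles

    def end_of_day(self):
        self.days_elapsed += 1

    def poll(self):
        """A reader asks for data; stay silent unless the period is due."""
        if self.days_elapsed < self.period_days:
            return None                        # silence leaves no tracking trail
        report = dict(self.totals)
        self.totals.clear()                    # cleared once upload is confirmed
        self.days_elapsed = 0
        return report

tracker = MileageTracker()
tracker.record(area_number=42, miles=12.3)     # e.g. San Diego surface streets
tracker.record(area_number=42, miles=8.1)
print(tracker.poll())                          # None: readers learn nothing yet
for _ in range(30):
    tracker.end_of_day()
print(tracker.poll())                          # roughly {42: 20.4}: totals only
```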

When you drive under these collection points, your tracker can also get updates. I would foresee updates happening all the time, with information on new areas, changed areas, and, as you move into an area you have never been, a new map. The tracker would only store a map large enough to cover, say, a few hundred square miles. If you went on a trip to Las Vegas, your tracker would pick up more areas as it passed through collection points. All of these data transactions would be anonymous, so you would not be tracked.

Each area, which is basically just a border of GPS coordinates, would have a number, and that number would have a corresponding mileage count that increments as you move around within that area. Once your collection time comes up, all that data is uploaded, and once the upload is confirmed received, your tracker gets a clear code. There would need to be safeguards against a tracker's data being cleared improperly. In the event a tracker is lost, there would be a fine, enough to cover the cost of the tracker itself plus the average amount of fees that would normally be collected in a period.
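Mapping a GPS fix to an area number can be as simple as a lookup against the published borders. A sketch, assuming each area is distributed as a plain bounding box (real borders would be polygons):

```python
# Hypothetical area table: number -> (south, west, north, east) bounds.
AREAS = {
    42: (32.6, -117.3, 32.9, -117.0),    # made-up San Diego box
    77: (36.0, -115.3, 36.3, -115.0),    # made-up Las Vegas box
}

def area_for(lat, lon):
    for number, (s, w, n, e) in AREAS.items():
        if s <= lat <= n and w <= lon <= e:
            return number
    return None                           # off the stored map: fetch an update

print(area_for(32.72, -117.16))           # -> 42
```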

Why so paranoid?
Because too much data can be used for malicious purposes. Here are several examples of what detailed tracking data could be used for:
-Let's say you're seeing a sociologist and you don't want anyone to know. If you were tracked, a pattern of always going to a particular area at a particular time would give away your secret.
-You could be having an affair. Not a nice thing to do, but nonetheless it's none of the government's business.
-What if you frequented a gun range and you were running for political office? Maybe that is not something you want people to know.
-In general, it's no one's business where you are or where you go. This is a free country; we don't have to be treated like communist Russia, where people had to carry papers to move around the country. It's plain and simple too much information that no one should have.

So as we move toward the inevitability of having to pay for the use of roads and other shared resources, a solution that respects our privacy while still providing enough data to pay for those resources is a win-win for everyone. The added benefit is a fairer way of collecting taxes: grandma with an older car that gets poor gas mileage but only drives to the grocery store and back, and the hybrid owner who drives 30,000 miles a year all over the country, will both be treated more fairly.

I would suggest that the mileage rates be based on the weight of the vehicle, as heavier vehicles cause more wear on the roads and thus more maintenance cost.

Idea for a Better Smartphone Screen

Maybe the next big thing?

While I think smartphones are incredible and they get better all the time, one thing that is missing is real buttons you can feel. I don't mean the small power and volume buttons, or the home button on the bottom; I mean real buttons that you can dial phone numbers with.

How do you do that?

How about this for an idea: make the screen out of a material like silicone (similar to what some cookware is made of) but stronger. Now put in vibration mechanisms that vibrate the material at a particular frequency, which will make what feels like a rubber surface feel smooth. OK, so this idea already exists.

Now for the fun part. Have a layer under this screen/substrate that is really an OLED screen printed on a silicone or rubber-like surface. Underneath that would be a carpet of very tiny solenoids, or maybe something more solid-state. These would push up on the top layer and form patterns. These patterns could be:
- Buttons for a dial pad/dialer
- Buttons for a game
- Buttons for any application
- A simulator for map reliefs
- Easy to feel bubbles for a message interface
And so on

The buttons could depress just like real buttons: the touch screen senses your press, and the solenoids retract. I believe that with the proper timing this could feel so real that it acts like a real button being pushed. It might even be possible to make the surface feel like different materials, based on the feedback pressure and the vibration frequency in that area of the screen.

With a dialer or texting app, this adds a level of safety, since you can feel the screen with your fingers without even looking at it. Now when you're driving (yes, you're not supposed to, but we all do anyway), instead of looking at the screen you can just feel the 12 buttons of the number pad, like on our old dumb phones, and then press another button to dial the call.

With texting, you could feel the balloons of the conversation and press on them to have your phone read them aloud, etc.

Thought I would share this "thought" with everyone. Now let's see if this actually happens. I hope so! If someone, or a team of someones, can figure this out, best of luck to you, and please give credit to this post.