From the desk of useless projects......comes the Hexclock!
A hexclock is the implementation of a system of time that works in hexadecimal. Instead of the arbitrary 24x60x60 second method, hexadecimal time divides the day into 65,536 individual "hexseconds", so 24h is exactly 0x10000. So without further ado, here are some pics:

What sucks is that the "8" being displayed is the only thing that can be displayed there. When I was transferring the board into that case, the chip controlling that digit got fucked up, and I won't be able to get a new one for a while. Environmentalists would also hate this; it draws about 450mA since the display is "live".

The time setting feature is the coolest part, imo. It is set by a computer, through the parallel port.
"If at first you dont succeed, then skydiving is not for you"  Darwin Awards
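The day-fraction conversion described above can be sketched in a few lines of Python. This is just a sketch of the idea; the function name and structure are mine, not from the actual project:

```python
# Hexadecimal time: the day is divided into 0x10000 (65,536) "hexseconds".
# A sketch of the conversion such a clock performs (names are my own,
# not taken from the original project).

def to_hexseconds(hours, minutes, seconds):
    """Convert an ordinary 24h clock reading to hexseconds (0..0xFFFF)."""
    day_fraction = (hours * 3600 + minutes * 60 + seconds) / 86400
    return int(day_fraction * 0x10000)

print(format(to_hexseconds(12, 0, 0), '04X'))  # noon -> 8000 (half of 0x10000)
```

Note how round hex values fall on natural day fractions: noon is exactly 0x8000.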
I don't get it. How do you tell the hexadecimal time with only the three-digit display?
any damage pics?
So you're saying a day is 10,000 seconds long?
When life gives you lemons...throw them back they suck!
so this system is a base 10 system multiplied by a thousand?
http://www.spudfiles.com/forums/viewtop ... Revolution in the absolute is not initiated by swords, guns, and bullets, but by words
The New RobesPierre
No, 65,536 hexseconds.
A day lasts about 60*60*24 = 86,400 seconds. It also lasts 65,536 hexseconds. That means that one hexsecond will last: 86400 / 65,536 = 1.318 seconds.

Because it's hexadecimal, you count like this: 1, 2, 3, 4, 5, 6, 7, 8, 9, A, B, C, D, E, F, G, 10, 11, 12, ..., 19, 1A, 1B, etc. 10000 hexadecimal = 65,536 in decimal.
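The arithmetic here is easy to check directly, and Python's built-in hex formatting is a quick way to see the actual digit sequence (note it runs 0-9 then A-F, with no G):

```python
# Checking the arithmetic above: one hexsecond in ordinary seconds,
# and what counting in base 16 actually looks like.

print(86400 / 0x10000)  # length of one hexsecond, ~1.318 s
print([format(n, 'X') for n in range(8, 18)])
# ['8', '9', 'A', 'B', 'C', 'D', 'E', 'F', '10', '11']
print(0x10000)  # 65536
```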
what's the point of letters?
@magnum
Because we have no digit characters except 0 to 9. Computers usually count in hexadecimal. It probably has something to do with binary code (01010101). A set of 4 bits (one byte) has 2^4 possibilities, which is 16. The G (the last digit in hexadecimal) is equal to 16 in decimal. 10 is not an outcome of 2^x. 2^3 = 8 means that a set of 3 bits to define one byte would not work, since then it would need to use an octal system. And if you're using 4 bits, why not use all 16 possibilities?

@jrrdw Computers, and nerds of course.

Damn, I didn't know I knew so much of that hex stuff. I'm almost looking smart.
Hexadecimal is a base 16 counting system. Our normal system is base ten, so we only have 10 different symbols to represent numbers. So they use the first 6 letters of the alphabet to represent the extra values, since the symbols and their order are already known by many people. http://en.wikipedia.org/wiki/Hexadecimal

Edit: psycix: 8 bits is a byte, 4 is a nibble. Also, hex only goes to F.
latest update: debut of the cardapult
The person who invented this must have had nothing but time on his hands......
Wow, so many replies, and replies to replies
The hexadecimal numbering system is simply a way of writing binary code in shorthand. Since a 4-bit code can be anywhere from 0 to 15, the ordinary 0-9 numbering system was rather cumbersome, so hex was invented.

Now, the hexclock has a few advantages over a regular clock:
- It doesn't need arbitrary conversion factors like 24 or 60; just move the point to convert between the hex equivalents of days, hours, and minutes.
- Less of the time is counted in the seconds (which aren't displayed on mine, just like on a regular clock), so the resolution is actually higher.
- It looks cool.

*Note: "0x" preceding a number means that number is in hex. 0x10000 (hex) is the equivalent of 65,536 (decimal).
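The "just move the point" advantage can be illustrated concretely: each hex digit of the 4-digit display is a power-of-16 fraction of the day, so changing units is a digit shift rather than a multiply by 24 or 60. A sketch, with variable names of my own:

```python
# "Moving the point" in hex time: each hex digit is a power-of-16
# fraction of the day, so unit conversion is a digit shift.

DAY = 0x10000                 # one day in hexseconds

sixteenth_of_day = DAY >> 4   # shift one hex digit: 0x1000 hexseconds
print(hex(sixteenth_of_day))  # 0x1000

# The leading display digit alone tells you which sixteenth of the
# day you are in:
t = 0x8000                    # noon
print(t >> 12)                # 8 -> the 8th sixteenth, i.e. half the day
```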
Darn! I knew I wasn't all correct the whole time. I should have added the "Correct me if I'm wrong" disclaimer.
The hexadecimal number system wasn't really invented per se, it just is. It's a natural extension of the binary, 1s-and-0s system. You can extend binary numbers out far enough to represent larger numbers, say to 11111111 (which equals 255 decimal, FF hex). Instead of place values of 1s, 10s, 100s, 1000s, etc., you have 1s, 2s, 4s, 8s, 16s, 32s, 64s, 128s. Add all these place values and they sum to 255.
Thus, here's where hex numbers come into play. You have 16 different values you can represent in 4 bits. If you stopped at 10, you would be wasting roughly 1/3 of your number or address possibilities per nibble. So the following:

0000 = 0
0001 = 1
0010 = 2
0011 = 3
0100 = 4
0101 = 5
0110 = 6
0111 = 7
1000 = 8
1001 = 9
1010 = A
1011 = B
1100 = C
1101 = D
1110 = E
1111 = F

Decimal would start over at 1010, becoming 0001 0000 (10) and wasting these extra "address" locations. The "extra numbers" had to be named something, so they just went with A to F. Keep in mind, numbers in a computer system represent memory or disk addresses, not just data. Did I lose a few people here?

Re. the hexclock: it is also just as arbitrary as the 24/60/60 second method. To be accurate within the period of a solar day, the clock is still dependent on the time accuracy of the increment counter. Also, a hexclock really isn't very useful in human-interface applications, more so as a potential timing source for machinery or software. Still, it could get unnecessarily confusing when converting to regular time for us humans. Nice academic project however! And a nice tweak to the environazis...
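The nibble-to-digit mapping above is exactly why hex works as binary shorthand: each group of 4 bits maps to one hex digit, independently of its neighbors. A sketch of that grouping (the helper name is mine):

```python
# Each group of 4 bits (a nibble) maps to exactly one hex digit,
# which is why hex is "binary shorthand".

def bits_to_hex(bits):
    """Convert a binary string to hex, one digit per 4-bit nibble."""
    assert len(bits) % 4 == 0
    digits = []
    for i in range(0, len(bits), 4):
        nibble = bits[i:i + 4]
        digits.append(format(int(nibble, 2), 'X'))
    return ''.join(digits)

print(bits_to_hex('11111111'))  # FF (255 decimal, as above)
print(bits_to_hex('1010'))      # A
```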
Anal attack.... Computers don't usually count in hex. They count in binary. Hex just packs 4 binary digits into a single digit. There are relatively few computers that actually use 4 bits for anything. The smallest common word size in modern computers is 8 bits. Most modern computers don't even use 8 bits as a logical unit anymore; usually it is at least 16 or 32 bits.
 
