From: owner-eda-thoughts-digest@smoe.org (eda-thoughts-digest)
To: eda-thoughts-digest@smoe.org
Subject: eda-thoughts-digest V1 #148
Reply-To: eda-thoughts@smoe.org
Sender: owner-eda-thoughts-digest@smoe.org
Errors-To: owner-eda-thoughts-digest@smoe.org
Precedence: bulk

eda-thoughts-digest      Monday, August 24 1998      Volume 01 : Number 148

Today's Subjects:
-----------------
 Re: ET: Re: eda-thoughts-digest V1 #130 ["Seth D. Fulmer"

------------------------------

Subject: Re: ET: Re: eda-thoughts-digest V1 #130

At 12:35 PM 8/24/98 -0400, Kevin Pease wrote:

>   Insignificant?  Tell that to the baby's parents.  :)  My parents still
> remember my first step, and I think there actually may be a picture of
> that exciting event somewhere around the house.

I consider my first step insignificant, and I would consider my baby's
first steps insignificant as well, as I would any baby's. I automatically
know they're going to do it, so it's no big accomplishment. Now, if the
baby has difficulty doing it (more than normal) and they accomplish it,
then I'll give them credit, but given the fact that it's an everyday
occurrence, it's no big deal.

>   I disagree.  Maybe the baby hasn't walked ten feet, but it has still
> taken a step, and it is now capable of walking that 10 feet.  What's more
> important (or maybe I should say harder) in your life - learning to take
> that first step, or taking the 70 trillion steps that will follow it?  I
> bet you don't even have to think about walking to do it these days, I
> know I don't... it's automatic.  Try telling that to a baby who hasn't
> learned to walk yet, though.

The 70 trillion steps that will follow it are harder in that they're a
tedious, repetitive task. I used to not think about walking, but for the
past couple of years I've been on a crusade to give a computer/robotic
being emotions, and I've been analyzing the most basic of my motor and
social functions to the lowest level. Walking is not that simple a task and
actually takes quite a few low-level motor functions if you count every
unique movement of every muscle. The mind has just automated it because
it's done it so many times.

>   No, that's where you're wrong.  The fact that you failed at that goal
> does not devalue the worth of your other accomplishments.  I worked my
> ass off in college, with the hopes of graduating with high honors.  I
> didn't get it... but I *did* graduate with honors.  Does the fact that I
> graduated "only" with honors mean all that effort I put into it was
> worthless?  Not in the least.  I still know I worked my butt off, and I
> also know that my study habits maybe could stand to be improved.  I still
> feel a lot of pride when I

Oh, I would know that I acquired honors and that my study habits needed to
be improved, but the fact that I failed at my task proves that I am unfit
to try that same task again. I'd give anything to live my life over from
September 12th, 1977. There are soooo many things I'd do over, including
actually making friends from the start rather than trying to destroy
others' friendships.

> see that little piece of paper hanging on my wall that says "Bachelor of
> Science, Biotechnology."  I know what went into it, and I know how hard I
> worked.
>   My "goal" was graduating with high honors... I didn't make that, but I
> can certainly be proud of what I *did* accomplish towards that goal, and
> I've learned where I can use some improvement, so when I go back to grad

But your work was for nil. Your work was in pursuit of your goal. Given
that you did not fulfill your goal, you wasted your effort.
If you pay a carpenter for a deck that goes out from your house 10 feet and
it only goes 9.5 feet, you've wasted your money (at least the money you
paid for that last .5 foot).

> school, maybe I can improve my study habits a little, and do a little
> better this time around.

But if I accept what I did as acceptable, then the next time I do it, I
will do it exactly the same way I did it the first time. On the National
Latin Exam, if I accept getting 1 wrong, then the next time I take it I'll
get 1 wrong again.

A prime example of this is my Psychology 101 class. I took it last term and
got a C on my midterm, even after all my studying. I dropped it because I
couldn't handle the work with everything else, and took it again this term.
Because I was satisfied with what I knew already and how I took the test (I
was happy with my effort on the test), I put the same effort into the test
the 2nd time around and got a C again. If I hadn't been so proud of it, I'd
have worked harder the 2nd time and possibly gotten an A.

>   Check that paradigm at the door, please... :)  Data is the reason the
> software exists... incorrect, faulty, or inaccessible data is useless.
> What's the point of having the Webster's complete Dictionary on a CD if I
> can't access it?  It's useless, albeit fun to throw around.  The
> computers won't crash, but the data will become corrupted, inaccurate, or
> just plain inaccessible...  This is one reason Object-Oriented design is
> so big these days.  It treats data as the most important part of the
> program (as it really should be)...
>
>   Bad data = useless programs.

Data is only as bad as its interpretation. If I tell you that 1 + 1 is 11,
that is both correct and incorrect. In almost any numerical interpretation,
it's incorrect. If you take the 1 and the 1 as symbols and just put them
together (concatenation), it's very true. Those 2 digits stored in a record
of a database can be interpreted any way the programmer wants. (There's a
quick sketch of this a little further down.)

>   The problem is "simply" a problem of software.  It's not a hardware
> error, the software with this error just cuts off the first two numbers
> of the year.  The hardware is fine, the software is faulty.

But it's not simply a software error in the case of the older PCs and some
add-on cards, as I found out at my last job. The actual computer that I was
using needed an extra chip so that it would be Y2K compliant, or else it
would revert the date after 12/31/99 @ 23:59:59.

>   Creating new software to access & use old databases is easier said
> than done.  It's easier to fix it most of the time than recreate it.  And
> really, for a lot of these systems, to redesign & start over from scratch
> would take well past midnight of Dec. 31 1999.  Start-to-finish design of
> software on an enterprise level is a long process.

But it's not difficult. It's called reverse engineering. My friends (2) and
I recreated Windows 95 (at least the front end... not the guts) in 4 hours.
The other parts would be more difficult for us, given that we don't have
the guts of the OS code, but still... it's not that hard to create software
from scratch if you basically imitate the behavior (reverse engineer). :)

>   Like I said, it's not a hardware problem... the chips are fine.  The
> software that interprets the data which the chips track is faulty.

Like I said before in this post, there are chips that will actually crash
because they have error-checking procedures set in their ROM programming,
and those chips will just not function any more.
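Here's a quick Python sketch of that "1 + 1 is 11" point from a few
paragraphs up. This is just an illustration I'm making up, not code from
any real system; the point is that the same two stored characters give
different answers depending on how the program decides to read them:

    # Two fields pulled out of a record; they arrive as text, not numbers.
    a, b = "1", "1"

    # Read them as numbers and add them.
    as_numbers = int(a) + int(b)    # 2

    # Read them as symbols and stick them together (concatenation).
    as_symbols = a + b              # "11"

    print(as_numbers, as_symbols)   # prints: 2 11

Same two characters in the record, two different results; the data didn't
change, only the interpretation did.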
>   We wouldn't?  Seems to me that we're doing this more and more as the
> days go by... computers & robotic devices are becoming more and more
> common and pervasive in everyday life.  We've abdicated a lot of
> responsibility for day-to-day work to mechanical systems.

If you're referring to calculators and computers, I agree that a lot of
humans are annoyingly addicted to letting calculators and computers do
their work for them (math, mostly). That grocery store attendants can't add
5 numbers together by themselves is pretty pathetic. In college we learn
how to do math not by hand but on the computer; I brought up to the
professor that I want to know how to do _____'s algorithm by hand and not
just with a program like Maple, MATLAB, or Mathematica, or a simple
programmable scientific graphing calculator. The DOD lets the defense of
this country lie in the hands of a computer with an extremely well tested
program, and I agree we should let it lie in the skill of a person to
target things and press a button, but they're not "running" the country in
the leadership sense, the way the president is (supposed to be).

>   I don't think I understand your example.  We get stupid, frivolous
> laws passed every day, and we're supposed to make it easier to get laws
> passed?  :)  I mean, do we really *need* a National Pork Month?

I never heard of National Pork Month... but those kinds of laws don't do
anything. They're like that extra procedure one puts in a program that you
could easily do without, but you put it in to make the program look good to
other programmers. :)

>   Um... Seth, that is the way it works now.  The chip keeps track of the
> time, and the software interprets the data that chip tracks.  The problem

On a lot of systems that's the way it works. However, there are boards and
chips where the hardware interprets the data; some of that hardware is BIOS
chips.

> The software just took the last two digits.  If you're coding a database
> with a need for a lot of year variables to be recorded, cutting off those
> first two digits can, over a database of 1,000,000 entries, save a lot of
> disk space & processing power.  When the software needs to access the

With a database, they can sort the entries appropriately according to
century. Visual dBase 5.6 has a setting called CENTURY which determines the
base date. If you set the base date to 1950, then a date of 47 means 2047.
There's another setting (I forget the actual variable) which actually
records the date as long or short: if you have 47 in a file, it uses
CENTURY; if you have 1947, it uses that date instead.
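Roughly, the "base date" idea works like the little Python sketch below. I
am not promising these are dBase's exact semantics or variable names; it's
just the general pivot-year trick, where a two-digit year gets a century
picked for it and a four-digit year is taken as-is:

    def expand_year(value, pivot=1950):
        # Two-digit years get a century chosen so the result lands in
        # the range [pivot, pivot + 99]; four-digit years pass through.
        year = int(value)
        if year >= 100:              # e.g. "1947" is already complete
            return year
        expanded = (pivot // 100) * 100 + year
        if expanded < pivot:         # e.g. 1947 < 1950, so bump a century
            expanded += 100
        return expanded

    print(expand_year("47"))         # 2047
    print(expand_year("51"))         # 1951
    print(expand_year("1947"))       # 1947

So the same two stored digits can still be read unambiguously, as long as
every piece of software touching the file agrees on the pivot.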
I hope I didn't scare anyone... If there's something you don't understand,
let me know :)

Seth D. Fulmer
mailto:kaosking@voicenet.com

------------------------------

Date: Mon, 24 Aug 1998 09:59:18 -0700
From: lil.goalie31@juno.com
Subject: ET: a poem

Hey angels. Here is just another poem that I wrote, so I thought I'd share
it with you all.....

"The Worst Pain of All"
by: Tara, the ice angel

Empty words fill a page
Written with undying rage.
A spinning head deceiving me
Is all I have, is all I'll be.
No one here to share the pain
Brightness turns to falling rain.
Cascading down, this bloody art
Worst pain of all, still piercing my heart.

As you can see, I am not much of a poet, but oh well. Talk to you soon!

- -Tara, the ice angel

_____________________________________________________________________
You don't need to buy Internet access to use free Internet e-mail.
Get completely free e-mail from Juno at http://www.juno.com
Or call Juno at (800) 654-JUNO [654-5866]

------------------------------

Date: Mon, 24 Aug 1998 15:20:52 -0400
From: "Kevin Pease"
Subject: Re: ET: Re: eda-thoughts-digest V1 #130

>> Seth D. Fulmer writes:

>I consider my first step insignificant, and I would consider my baby's
>first steps insignificant as well, as I would any baby's. I automatically
>know they're going to do it, so it's no big accomplishment. Now, if the
>baby has difficulty doing it (more than normal) and they accomplish it,
>then I'll give them credit, but given the fact that it's an everyday
>occurrence, it's no big deal.

If you compare it to, say, curing cancer, then yeah, it's no big deal. But
when your own flesh & blood gets up and starts walking around... that's
pretty amazing, and pretty wonderful. Shit, man, I don't have any kids, but
I do have a couple of little cousins I've watched grow up over the past few
years... they're not even mine, and I get excited when I watch them do
things...

If you ever have kids, I really, really hope that you'll change your
attitude. Otherwise, your kids will *never* be good enough to please you,
and they're going to grow up resenting you and hating you. At some point,
you have to be willing to say, "Hey, this is what I'd like to see you do,
but you gave it your best effort, good job," whether "you" is yourself or
somebody else. As I said before, if you keep upping the bar on yourself (or
on other people), you're never going to be satisfied with them, or
yourself.

>I used to not think about walking, but for the past couple of years I've
>been on a crusade to give a computer/robotic being emotions, and I've been
>analyzing the most basic of my motor and social functions to the lowest
>level.

You can crusade all you want to give a computer emotions, but you're going
to fail. A machine cannot "feel" emotions. You can program it to respond
certain ways to certain stimuli, but you cannot imbue that machine with
feelings. It will never feel anything, and it can respond to no more than
its programmer can code it to respond to... if you have to program
something to act a certain way, then it's not actually "feeling" an
emotion. You can probably make a computer do a pretty nice mock-up of
feelings, but it's not going to actually "feel" them.

>Walking is not that simple a task and actually takes quite a few low-level
>motor functions if you count every unique movement of every muscle. The
>mind has just automated it because it's done it so many times.

But wait, up above you just said that taking a first step is no big deal,
and that accomplishing that is not worthy of notice. If it's not a simple
task, shouldn't accomplishing that first step be newsworthy? :)

>Oh, I would know that I acquired honors and that my study habits needed to
>be improved, but the fact that I failed at my task proves that I am unfit
>to try that same task again.

WHAT? That's a remarkably defeatist attitude. Ever hear the sayings
"Practice makes perfect" or "If at first you don't succeed, try, try
again"? What a ridiculous cop-out it is to say, "I didn't make it the first
time, so I'm not worthy to try again." You learn from your mistakes, you
grow as a person, and you keep on trying if the goal is that important to
you.

>I'd give anything to live my life over from September 12th, 1977. There
>are soooo many things I'd do over, including actually making friends from
>the start rather than trying to destroy others' friendships.
Yeah, if I only knew back then what I know now, my life would be totally
different, too. But you can't live your life over - what's done is done.
The only thing you can change in your life is *right now* - you can't go
back to kindergarten and make new friends, but you can make new friends
right now. You can't go back and take back something you said to someone 10
years ago, but you can stop yourself from saying it to someone else now, or
in the future.

>But your work was for nil. Your work was in pursuit of your goal. Given
>that you did not fulfill your goal, you wasted your effort. If you pay a
>carpenter for a deck that goes out from your house 10 feet and it only
>goes 9.5 feet, you've wasted your money (at least the money you paid for
>that last .5 foot).

You know, I find this more than a little insulting. Don't apply your belief
system to me, thank you very much. If you want to think that a
less-than-perfect record is useless, that's your deal. My work was most
certainly not for nothing... I have a good job as a result of that degree,
and I have a firm foundation to build on when I go back to graduate school
in a year or two. Your analogy, quite frankly, sucks, because I wasn't
paying a carpenter to build a deck, I was paying the school & the teachers
to educate me, and to allow me to educate myself. If I didn't make it that
little extra, that doesn't mean all my time was wasted and all my money was
wasted. It means I didn't make it that little extra, period. I consider
college to be time well spent, and money well spent.

If you want to beat yourself up over not being perfect, feel free. But
don't beat me up, or expect me to beat myself up, because I didn't live up
to your expectations.

>But if I accept what I did as acceptable, then the next time I do it, I
>will do it exactly the same way I did it the first time.

Um. HUH? This logic seems terribly flawed. If you're doing it again, you've
obviously decided what you did the first time was not acceptable. The point
is this - when you do something, even if you don't do it as well as you had
hoped, you have to learn to accept that, good or bad, and learn from the
experience. You can do it again and try for results that are more
acceptable... I'm not saying to be happy with mediocrity or anything to
that effect... what I am saying is, accept your failures, and learn from
them. Don't beat yourself up over what's already been done, and don't use
one failure as an excuse to never try again.

>On the National Latin Exam, if I accept getting 1 wrong, then the next
>time I take it I'll get 1 wrong again.

This, again, is extremely flawed. When I took my SATs, I was pleased with
my first results, but they certainly weren't perfect. I decided to take
them again, and did a lot better... why? Because I knew what to expect, to
some degree, and I wanted to do better the second time around. I wouldn't
be surprised if most people who take any sort of test do better the second
time around, for the same reasons.

>A prime example of this is my Psychology 101 class. I took it last term
>and got a C on my midterm, even after all my studying. I dropped it
>because I couldn't handle the work with everything else, and took it again
>this term. Because I was satisfied with what I knew already and how I took
>the test (I was happy with my effort on the test), I put the same effort
>into the test the 2nd time around and got a C again. If I hadn't been so
>proud of it, I'd have worked harder the 2nd time and possibly gotten an A.
Or, it's entirely possible that Psychology 101 just wasn't your subject. I
killed myself studying for Probability, and trying to do the homework for
that class... I still didn't pass. I understood the concepts, but actually
putting them into practice in homework & test examples was difficult. (To
me, at least...) This doesn't mean I'm a complete idiot; it means that
probability is not something my mind can wrap itself around easily. If I
took it again, I probably wouldn't be able to do much better, no matter how
much effort I put into it. I put in all sorts of time working on the
homework & studying for exams, and still failed. Instead of beating myself
up over a class I didn't need, I just went on to do other stuff. :)

>Data is only as bad as its interpretation. [...] Those 2 digits stored in
>a record of a database can be interpreted any way the programmer wants.

Obviously, they can be. Should they be? No. They should be interpreted the
way the customer wants them to be interpreted. If I'm a programmer, and I
decide I want my program to access a database and return those two digits,
but reversed, then when my customer comes in and tries to use my software,
the customer will end up with useless or incorrect data, because he's not
going to understand what's been done to the data. As I said before... bad
data = useless software. Nobody's going to use your product if you don't
make it do what they expect.

>But it's not difficult. It's called reverse engineering. My friends (2)
>and I recreated Windows 95 (at least the front end... not the guts) in 4
>hours. The other parts would be more difficult for us, given that we don't
>have the guts of the OS code, but still... it's not that hard to create
>software from scratch if you basically imitate the behavior (reverse
>engineer). :)

Um, yeah, it really is hard to create software from scratch, even if you're
basically imitating existing behavior. Try working as a software engineer
for a while before you decide how easy it is. Yeah, making a mockup of the
Windows 95 front end is real simple, provided you have Visual C++ or Visual
Basic, and already have Windows 95 or NT running on your computer. Windows
95, however, is much more than a graphical interface; it's a whole
operating system. You could attempt to reverse engineer it, but that
attempt would require a lot of time, effort, and annoyance on your part. I
can probably cobble together a cute little front end that looks just like
Windows 95 in a couple of hours, too... but it's going to be totally
functionless, and then all I have is a cute front end, not a working,
reverse-engineered copy of Windows 95. To imitate something in detail, you
still have to design, code, test, and revise your software, which takes a
LONG time. Proper start-to-finish engineering of any piece of software is
measured in months & financial quarters... not hours.

>If you're referring to calculators and computers, I agree that a lot of
>humans are annoyingly addicted to letting calculators and computers do
>their work for them (math, mostly). [...] The DOD lets the defense of this
>country lie in the hands of a computer with an extremely well tested
>program, and I agree we should let it lie in the skill of a person to
>target things and press a button, but they're not "running" the country in
>the leadership sense, the way the president is (supposed to be).

And more and more, the Department of Defense (DOD) and the government in
general are letting automation take over.
Giving machines a lot of our own responsibilities does, in a lot of cases,
mean we're at the mercy of the machines. Ever try to get an ATM to give you
money when you've exceeded your daily withdrawal limit? I can get around
that limit in at least one way that I know of, by going to a grocery store
nearby and writing a check for an amount higher than the total of my
purchases, because it's a person giving me that money. The ATM sure as heck
won't, though, because it's been programmed not to, for security reasons.
If I can't find a place like that nearby, I'm up the creek, because that
computer won't give me my money, no matter how hard I try to convince it
to. It's those little day-to-day tyrannies that add up...

Do I think we're going to someday be ruled by computers? No. Do I think
that we're becoming more and more reliant on them? Most definitely. Do I
think in some cases that's a bad thing? Probably.

>I never heard of National Pork Month... but those kinds of laws don't do
>anything. They're like that extra procedure one puts in a program that you
>could easily do without, but you put it in to make the program look good
>to other programmers. :)

No, those laws are useless... that's not the same as not doing anything.
They do something: spend tax money, primarily. Your point was that the
"hardware" of government makes it very difficult to change any of the
"software" of the laws. My counterpoint is that frivolous laws like these
are passed every day. We have a national week, month, or day for just about
everything. Do we really want to make it easier to change the "software" in
government? I don't think so.

>With a database, they can sort the entries appropriately according to
>century. Visual dBase 5.6 has a setting called CENTURY which determines
>the base date. If you set the base date to 1950, then a date of 47 means
>2047. There's another setting (I forget the actual variable) which
>actually records the date as long or short: if you have 47 in a file, it
>uses CENTURY; if you have 1947, it uses that date instead.

Thanks for the database lesson, but my point still stands. Back when they
started programming on these lower-powered computers ("640k of memory
should be enough for ANYBODY..." or something to that effect; I believe it
was Bill Gates who said that...), they were looking for any reason & chance
to cut a little bit of storage space from a file, reduce the amount of time
it would take to process & access that file, and cut down on memory usage.
Now that we have desktops which come standard with 64 or 128 megs of RAM,
plus a virtual memory file courtesy of Windows, that's not as big a deal.
But when you're trying to write 1,000,000 entries of a database to an 80
megabyte disk drive, well... you need to save space. THAT is the reasoning
that led to the emergence of the Y2K problem.

(By the way, what's the release date on Visual dBase 5.6? I'm guessing it's
not 1968. They implemented that function probably as a result of being made
aware of the year 2000 bug.)
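Just to put rough numbers on the space argument, here's a
back-of-the-envelope Python sketch. The figures are my own made-up
assumptions (the three date fields per record in particular is invented for
illustration), not measurements of any real system:

    # Back-of-the-envelope only; the per-record field count is an assumption.
    records = 1000000
    date_fields_per_record = 3             # say: birth date, hire date, last update
    bytes_saved_per_field = 2              # storing "47" instead of "1947"

    saved = records * date_fields_per_record * bytes_saved_per_field
    disk = 80 * 1000 * 1000                # an 80 megabyte drive, roughly

    print(saved)                           # 6000000 bytes saved
    print(round(100.0 * saved / disk, 1))  # about 7.5 (percent of the whole disk)

Several percent of the entire drive, just to store the same "19" a few
million times over. On hardware like that, dropping the century was an easy
call to defend at the time.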
Kevin

- ----------
Kevin Pease
kbpease@boston.crosswinds.net
(ICQ UIN: 3106063)  (AOL Instant Messenger: kbpease)
http://www.crosswinds.net/boston/~kbpease

"I feel like a quote out of context, withholding the rest,
So I can be for you what you want to see;
I got the gestures, sounds, I got the timing down,
It's uncanny, yeah you'd think it was me;
Do you think I should take a class to lose my southern accent?
Did I make me up, or make a face 'til it stuck?
I do the best imitation of myself..."
                                        -----(Ben Folds Five)-----

------------------------------

End of eda-thoughts-digest V1 #148
**********************************