Programming blind spot.

I approach learning a new subject in a scatterbrained way. I tend to jump from topic to topic, absorbing a little in every area, until I have a moment of revelation where the framework for learning becomes clear. This works well for me a lot of the time but fails utterly when learning to program. I need to be more focussed and concentrate on the task.

Your code makes perfect sense to me, all I need now is the compiler!

Hey Oleron,

I think a language like BASIC may be useful to you. My pseudocode in that example is a lot like it.

You really need to be able to develop a mental model of your programming problems (I mean, tasks) in terms of the abstract computing machine.

Good luck!
 
If you're having problems trying to figure out where to start, I would recommend looking for repetitive tasks you tend to do once an hour/day/week while at work. Those that take up a significant amount of time (e.g. more than 10 minutes) are a good place to start.

For example, say you have three services that store data. Now, these three services lock access to the files they use and, for some reason or other, have no online backup capability. Being the good sysadmin you are, ;) , you dutifully stop the services once a day, copy the files to a folder called Backup and rename them in the format "MM_DD_YYYY FileType". Then you restart the services and all is well with the world.

One day you think to yourself: "Hmmmm, this repetitive task is getting boring and repetitive." "Wait, did I just say repetitive twice? I'm so silly. hehe". Oh, whoops. That's what I would say to myself. :D

Here's what you would think: "Hmmm, this repetitive task doesn't change much each time I do it. How could I get the computer to do this automatically for me?"

"Let's see"

"First thing I've gotta do is stop those services."
"I can do that easily with Net Stop."

"Okay, so the first lines of my program will be this:"

Shell "Net Stop Service1"
Shell "Net Stop Service2"
Shell "Net Stop Service3"

"What's next?"
"Copy and rename the files!"

File.Copy("C:\DataFiles\Service1.wee", "C:\Backups\10_25_06 Service1.bak")

File.Copy("C:\DataFiles\Service2.wee", "C:\Backups\10_25_06 Service2.bak")

File.Copy("C:\DataFiles\Service3.wee", "C:\Backups\10_25_06 Service3.bak")

"Okie. This is looking good!"
"Whaddaya expect from a handsome, smart, intelligent person such as myself. ;)"

"Now to turn back on the services."

Shell "Net Start Service1"
Shell "Net Start Service2"
Shell "Net Start Service3"

"Awesome. Now all I have to do is change the date every time I run my program. But wait. I have to do that three times. Blah. I'm way too lazy for that."

"Let's try this..."

Dim backupDate as String = "10_25_06"

File.Copy("C:\DataFiles\Service1.wee", "C:\Backups\" & backupDate & " Service1.bak")

File.Copy("C:\DataFiles\Service2.wee", "C:\Backups\" & backupDate & " Service2.bak")

File.Copy("C:\DataFiles\Service3.wee", "C:\Backups\" & backupDate & " Service3.bak")

"Awesome. Now I only have to do it once. Hmmm.... Anyway I could make this even better? What if the computer could write in the date for me?!?!"

Dim backupDate As String = Today.ToString("MM_dd_yyyy")

"Woo hoo! Nothing to do now but relax and drink pina coladas!" :D

'-----------------
Imports System.IO
Imports Microsoft.VisualBasic

Module BackupScript

    Sub Main()
        Dim backupDate As String = Today.ToString("MM_dd_yyyy")

        ' Stop the services, waiting for each command to finish before moving on.
        Shell("Net Stop Service1", AppWinStyle.Hide, True)
        Shell("Net Stop Service2", AppWinStyle.Hide, True)
        Shell("Net Stop Service3", AppWinStyle.Hide, True)

        ' Copy each data file into the backup folder, stamped with today's date.
        File.Copy("C:\DataFiles\Service1.wee", "C:\Backups\" & backupDate & " Service1.bak")
        File.Copy("C:\DataFiles\Service2.wee", "C:\Backups\" & backupDate & " Service2.bak")
        File.Copy("C:\DataFiles\Service3.wee", "C:\Backups\" & backupDate & " Service3.bak")

        ' Start the services back up.
        Shell("Net Start Service1", AppWinStyle.Hide, True)
        Shell("Net Start Service2", AppWinStyle.Hide, True)
        Shell("Net Start Service3", AppWinStyle.Hide, True)
    End Sub

End Module
'-----------------

So here's the trick to simple scripting.

1. Find a task that you do manually.
2. Write out that EXACT same task in the simplest code you can manage.
3. Find out which parts always stay the same and which change.
4. Put the parts that change into variables (Containers for data).

Voila you have now subjugated your computer and can rule over it with a firm, yet gentle, hand. ;)

After you've got that down, you can start making more complex scripts using logical statements. The funny thing is that you write those in the exact same way you think as well. Programming becomes easy when you realize that all you have to do is solve the problem in code the exact same way you solve it in your head.
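
For instance, a logical statement could guard the copy step so the script doesn't fall over when something is missing. This is just a hypothetical refinement (it assumes the Imports System.IO and backupDate lines from the script above):

If Not Directory.Exists("C:\Backups") Then
    ' Create the backup folder if it isn't there yet.
    Directory.CreateDirectory("C:\Backups")
End If

If File.Exists("C:\DataFiles\Service1.wee") Then
    File.Copy("C:\DataFiles\Service1.wee", "C:\Backups\" & backupDate & " Service1.bak")
Else
    Console.WriteLine("Service1 data file is missing - nothing to back up.")
End If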

Next week, "Encryption Made Easy!". Find out what's so fascinating about prime numbers and how you can make them your bitch. :D
 
JamesM said:
The whole point of programming a computer is to get stuff done. The more a programming language hides what's happening under the hood, the better. The low level paraphernalia that C provides is a complete distraction from what's important: manipulating whatever it is you want to model at the correct level of abstraction.
But the problem is that programming language books mostly suck at teaching the abstractions that are embodied in the language. And this is particularly true for modern programming languages such as Perl, where Wald thinks the best way to learn a language is to learn a pile of syntactic hacks and their associated arbitrary semantics (witness the sigils). Apparently this sort of language learning works for many people, just as the "whole language" approach often works for learning reading, but a large number of people are just befuddled.

On the other hand, when you learn assembly language or low-level C, you're forced to learn the abstractions, because that's all there is and they are relatively simple. I'm not suggesting we all go back to assembly language, mind you.

RSLancastr said:
I've found that the best way to learn a language is to fiddle around with a program which has already been written
That is a good suggestion.

I've programmed for a living for 30 years, and have never learned assembler.
I've been programming for 40 years, so you just missed the boat. :D

Again, I've coded for 30 years, and I don't even know the difference between the two [binary arithmetic and boolean arithmetic].
Are you sure? You're not one of these programmers who never uses a boolean expression, are you? You don't code 'if a == true then ...", do you? :D
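
In VB terms, the difference is just this (a throwaway snippet, not from anyone's real code):

Dim isRunning As Boolean = True

' Redundant: comparing a Boolean to True just hands you the same Boolean back.
If isRunning = True Then Console.WriteLine("running")

' The Boolean expression already is the condition.
If isRunning Then Console.WriteLine("running")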

~~ Paul
 
I've been programming for 40 years, so you just missed the boat. :D
I never programmed using a patch panel either. :D

Are you sure? You're not one of these programmers who never uses a boolean expression, are you? You don't code 'if a == true then ...", do you? :D
Heyyyy!!!

Boolean logic I use all the time.

But the only time I ever did math with binary numbers was in programming school in 1976 or so. It was in a class which should have been titled "things you'll never have to do again."
 
I don't really need to know how to program for my job, although scripting and manipulation of the Windows API would come in useful. It's the thought that I CAN'T program, no matter how hard I try, that bothers me. I see it as an intellectual challenge.

There's your first mistake: DATKS, design-at-the-keyboard syndrome. Before you can write a line of code, you have to have a mental picture of what you want to do, and how you can then do it.

First of all you have to understand what the languages are capable of, and how they do it. Most tutorials will give you a very brief overview of these things, but getting to the stage of being able to build any sort of real application takes at least a year of study. Of course there are freaks out there who can just take a look and go, but most of us are human and take a lot longer to reach the point where we can achieve something significant. Toy applications that do something simple can be built quite quickly, and that's where you have to start.

Once you have a basic understanding of what tools the language gives you to get the job done, you can then work out how you want to use them to take in the data you are working on and decide where it is going to end up. If you are going to use a real engineering approach, you will document all the inputs, outputs and data stores, how the data moves between them and how it is manipulated, and then write the code to do it all.

However, to answer your basic question, first of all, spend some time learning one language, using one development tool and environment, then try to build some simple applications and see how the bits all connect with each other.

For example, you aren't going to be able to just sit down and write a game engine, database, operating system, or anything like that.

Now, what do you actually want to do with your programming knowledge? What did you see yourself being able to achieve with it? That would be the first place to start.
 
Oleron:

I've found that the best way to learn a language is to fiddle around with a program which has already been written:

1. Find the code for an existing program (hopefully a relatively small and simple one).
2. Study the code a bit to try and understand a bit about how it works.
3. Try to make a simple modification to it.
4. Try to make another simple modification to it.

... and you're off!

I've programmed for a living for 30 years, and have never learned assembler. Not only is it not necessary, but it could quite well scare off a newbie.

Once, about fifteen years ago, I decided I should learn assembler, and bought a book on beginning assembler.

After going through four chapters before I could write a program which simply displayed an asterisk on the screen, I said "life is too short for this crap."

Maybe learning assembler is a young person's thing. Only they would have the time for it...

Again, I've coded for 30 years, and I don't even know the difference between the two [binary arithmetic and boolean arithmetic].


I'm going to second the suggestion to find an existing sample program that is *very* simple, and tinker with it. As your skills grow, you can add more sophistication. Choose a language with buckets of sample code, as per the first point above, so you can see lots of simple examples.

The second piece of the puzzle is to find a resource book that has a list of the functions or whatever for the language in question, and keep it beside the mouse and expect to use it often to 'look things up'. In other words: learn by reverse-engineering examples with a reference book.

I would also choose a language that you can use at the drop of a hat, which means that you can use it on your own computer, or with a quick login. This is why I use perl myself (my machines are Linux or Macs), but the above suggestions for that sophisticated DOS shell sound appropriate for a Windows environment. The other reason I prefer perl as a learning language is that it's a scripting language. This means "instant gratification", i.e. no futzing with a compile. Edit, save, run... edit, save, run... I think a learner develops faster.

Lastly, have a simple project in mind, and when you think you've got to a comfortable point in your ability to tweak existing code, try to adapt one to your project.
 
Hmmm, I'm having a look at this PowerShell stuff that RyanRoberts suggested. Looks like it might do the business.

The task I have chosen is to design a simple software/hardware audit tool for a Windows environment. It would be nice if the tool could populate a database table with this info from all my network clients, but I might be over-reaching myself there! Generation of a text file from each client will suffice for now.

I know I could buy such a tool for a few quid but I would learn nothing that way.
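
Just to make the goal concrete to myself, a first cut might look roughly like this, written in the same VB.NET style as the backup example earlier in the thread (everything here is made up, the PowerShell version would no doubt be shorter, and a real audit would pull software and hardware details from WMI rather than these few Environment properties):

'-----------------
Imports System.IO
Imports System.Collections.Generic

Module AuditSketch
    Sub Main()
        ' Write a few basic facts about this machine to a per-client text file.
        Dim auditFolder As String = "C:\Audit"
        Directory.CreateDirectory(auditFolder)   ' does nothing if the folder already exists

        Dim lines As New List(Of String)
        lines.Add("Machine:    " & Environment.MachineName)
        lines.Add("OS:         " & Environment.OSVersion.ToString())
        lines.Add("Processors: " & Environment.ProcessorCount.ToString())
        lines.Add("User:       " & Environment.UserName)
        lines.Add("Audited:    " & Date.Now.ToString("MM_dd_yyyy HH:mm"))

        File.WriteAllLines(auditFolder & "\" & Environment.MachineName & ".txt", lines.ToArray())
    End Sub
End Module
'-----------------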
 
Oleron said:
jm01 - Isn't C or Assembly fiendishly difficult to learn? I always kinda avoided these kind of languages.

With regards to Assembly, I accept that it is quite a difficult language to learn. The reason I mentioned that is so that you will get a fair idea of how the CPU works.

Coming from the old school: binary arithmetic is, say, 1001 + 0100. It is simple arithmetic, only in binary, and it also includes things like bit shifting. Boolean arithmetic is "or", "xor", "and", and so on. Boolean logic is "if condition (or condition and condition) then statement else statement". Mind you, the industry could have invented new terms for those.
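
In VB terms the distinction looks roughly like this (a small illustrative snippet, values picked at random):

' Binary arithmetic: ordinary arithmetic and bit shifting on the bit patterns.
Dim a As Integer = 9            ' 1001 in binary
Dim b As Integer = 4            ' 0100 in binary
Console.WriteLine(a + b)        ' 13, which is 1101
Console.WriteLine(a << 1)       ' 18, which is 10010 (shifted left one bit)

' Boolean arithmetic: And/Or/Xor on true/false values (or bitwise on integers).
Dim p As Boolean = True
Dim q As Boolean = False
Console.WriteLine(p Or q)       ' True
Console.WriteLine(p Xor q)      ' True
Console.WriteLine(a And b)      ' 0 - bitwise And of 1001 and 0100

' Boolean logic: the "if condition then ... else ..." shape.
If p Or (a > b And q) Then
    Console.WriteLine("condition held")
Else
    Console.WriteLine("condition failed")
End If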

Knowledge of the CPU is very important: when the machine throws up, you can examine and understand the memory dump. Of course, we have tools to do that for us. But what happens when there is a problem at the lowest layer (the machine code)? What do we do then?

The advantage of C is that there are only a few "reserved" words you have to understand. Most of the time, you'll just be using linkable library code.

I also used Delphi for some projects that I did. What I like about Delphi (Pascal) is its strong typing. Also, it gives you a clear view of how messaging works. You can create your own message and receive/send it.

Some people might not agree with me, but VB encourages bad programming habits. You can use a variable without first defining it. I am also against the use of variants if they can be avoided.
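
For what it's worth, VB.NET lets you turn the compiler checks up at the top of a file, which catches the undeclared-variable habit; a minimal sketch (the module and variable names are just for illustration):

Option Explicit On   ' every variable must be declared with Dim before use
Option Strict On     ' no late binding and no implicit narrowing conversions

Module StrictDemo
    Sub Main()
        Dim total As Integer = 0
        'totl = 5             ' misspelled name: a compile error, not a silent new variable
        total += 5
        Console.WriteLine(total)
    End Sub
End Module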

I do not agree with the way the industry is heading. We create layers upon layers of code. I know a lot of programmers who don't have any idea how the lowest layer (machine code) works. I am afraid that in the future we will end up with code that only a few people can maintain.

I used to create operating system components for the mainframe. A maximum memory of 16M was sufficient to service thousands of users around the world. Now the hardware has to keep up with the software. But that's just me.

Anyway, before you can code, you'll need to define exactly what you want to achieve. If you don't have a target, you'll just end up staring at the screen.
 
Another old fart chiming in with tuppence worth of advice (started in 1969, Fortran IV and Algol 60 at school in Scotland). I did Computer Science at university and have done huge amounts in assembly language professionally, which I absolutely thoroughly enjoyed - we developed a complete database system, written in our own dialect of APL, the interpreter/compiler for which we wrote in ca. 3 million lines of IBM Assembler.

Here's my suggestion along similar lines to those that have recommended "tinkering" with existing programs.

Get a programmer colleague whom you trust as being very good to take a reasonably large program (i.e. not completely Mickey Mouse), intentionally introduce a number of bugs, and then give you the program to debug. This way, you'll need to start understanding pretty much every statement in the program in order to judge whether it's correct or not.

The language is not too critical; obviously, though, I don't think it would be appropriate to start with Algol 60. Ideally the colleague should choose a "cleanly"-coded program; you don't want to pick up bad habits straight away - you've plenty of time for them later ;)

At the end of the exercise, you've ended up with your first template, which you can then re-use and recycle.
 
Oleron:
I've found that the best way to learn a language is to fiddle around with a program which has already been written:

Second that..

Most of my projects start off with me copying some of the old code I had and then modifying it. If I had to start from scratch I would be thumbing through my text books most of the time.

I see learning a computer language very much like learning to speak a foreign language. You can read through as many textbooks as you like, but being exposed to a native speaker and trying to convey your intentions will always be difficult.
The only way is to find a problem you want to solve and try to build up a solution yourself. You may need to copy existing code and adapt it, but eventually you will have enough knowledge to build code yourself.

This of course does not tackle issues like debugging, or philosophies like bottom-up and top-down design, but these can come later.

I think the problem today is that most of the popular languages push you directly into event-driven object-oriented programming, which is not always intuitive for a first-time programmer.
 
pauldmin said:
I think the problem today is that most of the popular languages push you directly into event-driven object-oriented programming, which is not always intuitive for a first-time programmer.

I do agree. Event-driven technology hides the fact that, in reality, the machine works in a procedural manner. For a new person learning how to program, it becomes confusing when their routines are suddenly triggered from nowhere.
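
A tiny VB.NET sketch of what that feels like (all the names here are made up): the handler below is never called by name anywhere; it runs because the event fires, which is exactly the part that disorients newcomers.

Module EventDemo

    ' A class that raises an event instead of calling its caller back directly.
    Public Class BackupJob
        Public Event Finished(ByVal fileName As String)

        Public Sub Run(ByVal fileName As String)
            ' ... do the procedural work here ...
            RaiseEvent Finished(fileName)   ' the "trigger from nowhere"
        End Sub
    End Class

    Sub Main()
        Dim job As New BackupJob()
        AddHandler job.Finished, AddressOf OnFinished   ' wire the handler up; nothing runs yet
        job.Run("Service1.bak")                         ' somewhere in here the event fires
    End Sub

    Sub OnFinished(ByVal fileName As String)
        Console.WriteLine("Backup finished: " & fileName)
    End Sub

End Module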
 
Do you want to program or do you want to be a programmer?

The two things are not the same. I can cook, but I'm not a chef.

If you just want to program things for your own enjoyment then I would get something like VisualBasic.net Express. It's free and there is a lot of example code out there.

If you want to be a professional programmer then take classes.

And as logical muse said, you will become a much better programmer if you really understand how a computer works. You may never use assembler in your professional life, but understanding it will help you.

And just so you know 60% of all programmers are not very good at it. This is an observation I've made over 20+ years of writing software, working in all kinds of companies from Fortune 100 to small start-ups to dealing with outsource "talent".

The whole point of professional programming is NOT just to get things done. It is to write clear, concise, efficient, maintainable and usable code. Making something work is only a part of it.
 
And just so you know 60% of all programmers are not very good at it. This is an observation I've made over 20+ years of writing software, working in all kinds of companies from Fortune 100 to small start-ups to dealing with outsource "talent".
I would say it's more like 80%.
heh, but 78.645% of all statistics are made up.
 
jm01 - Isn't C or Assembly fiendishly difficult to learn? I always kinda avoided these kind of languages.

Assembly really is not difficult to learn. There are only about 100 atomic operations assembly can do, and they are for the most part context insensitive. It is, however, fiendishly difficult to get much done in assembly. It depends on how deep, and in what direction, your curiosity is. If you really want to know "how does software accomplish that?" you will need to know Assembly.

The type of work I do requires me to be fairly comfortable with assembly; however, for 99% of the work out there it is not necessary. Nowadays there is very little professional justification for being an ace assembly coder. If you do decide to start there (because you're just curious), make sure you start with good old MS-DOS (not Linux/Bash, not Windows, not Mac).
 
<sarcasm> But Steve Gibson says EVERYTHING should be written in assembly and everyone knows he's a coding god! </sarcasm>

lol

The thing about C (and C++) that tends to throw most people for a loop is pointers. Especially pointers to pointers. But once you get it, you get it.

As for assembly: none of my guys know it and I don't expect them to ever learn it for work. I learned it years and years ago, but these days I only employ it when I look at disassembly, and that's mostly for bugs. For optimization it's mostly useless when it comes to modern Windows programming, as you never know what the JITter will end up doing on a different system.
 
Assembly really is not difficult to learn. There are only about 100 atomic operations assembly can do, and they are for the most part context insensitive.

I don't believe Assembler is more difficult to learn, but it is certainly different from high-level languages, and there's no justification for a beginner to try to learn both at the same time. The size of the instruction set is not the issue: RISC machines are in fact more difficult for a human (but easier for a compiler) than their CISC predecessors. The only Assembler I learnt was the Motorola 68k, since that was the processor my first computers used (Amiga 1000, Macintosh Classic). The only real assembler programs I've written were for my Texas Instruments TI-92 pocket calculator, for which I simply had no choice. I've never bothered with x86 assembler, as I've never felt the need.
 
