Sometimes I feel like I may be pushing myself along a little too quickly with computer programming. For instance, I'm in the process of learning VB.NET and have gotten to the point where I feel like I can start trying a few things, so I decided to look at some of the code for our custom tools here at work. It kind of makes sense to me, but I don't even know how to begin figuring out what objects I need to reference. I work in Geographic Information Systems, and the software we use (ArcGIS) has an object model that is just insane. Here's the ArcObjects model if anyone wants to take a look: http://edndoc.esri.com/arcobjects/9.1/ArcGISDesktop/AllDesktopOMDs.pdf
All of our stuff is done in VB.NET, so I was trying to find some sort of documentation to at least get me started. That's when I realized I don't yet have the background to even understand the jargon they use to explain things. For instance, I don't know what "hashing" is. I know these are basic concepts for anyone in computer science, but I haven't had that many CS courses. Do you have to know all the theory to be able to program, or is it something you can pick up on your own through experience? Does it take a long time to get to the point where you can just jump right in and do exactly what you want?
I'm thinking the learning curve for this stuff is a lot steeper than I expected and that it will take a while to get the hang of it. I feel like I should be able to just jump right in, but all that's done is frustrate me and make me want to give up. Kind of that "I'm too stupid to do this" feeling. It's overwhelming at times.