Look at the assembly output of a compiled PBP program, then compare it with the PBP source code it came from. You'll see that the two are very similar in size and function.
I have found that there is very little overhead in PBP, and speed follows overhead: less overhead means more speed. If your PBP routines are complex, the final assembly code will be complex too. That goes for any language, even straight assembly. If you've got something complicated to do, it's going to take complicated steps to get complicated work done.
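To make that concrete, here's a rough sketch in C; the same point applies to PBP, since both compile straight down to PIC instructions. The PORTB address below is only illustrative for a generic mid-range 16F part, so check your device's datasheet:

    #include <stdint.h>

    /* Memory-mapped PORTB register; 0x06 is its bank-0 address on many
       mid-range 16F parts (address assumed here for illustration). */
    #define PORTB (*(volatile uint8_t *)0x06)

    void pulse_rb0(void)
    {
        PORTB |= 0x01;              /* roughly one instruction: bsf PORTB, 0 */
        PORTB &= (uint8_t)~0x01;    /* roughly one instruction: bcf PORTB, 0 */
    }

Each source statement comes out as about one machine instruction, and a PBP line like PORTB.0 = 1 compiles essentially the same way, to a single bsf. The overhead isn't in the language; it's in the work you ask for.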
Now then, if the person who told you that PBP was slow and bloated with overhead was actually talking about the BasicStamp (it's happened many times before), then yes, the BasicStamp is a lot slower. Its programs are interpreted on the fly, which costs a lot of memory and CPU cycles. PBP is compiled to native code before being burned into the chip.
It's a lot like learning how to do something the first time.
Either you can read a book and memorize each step ahead of time so you can do the task without stopping (PBP), or you can put the book 20 ft away on the other side of the room and walk over to it for each step of the task (BasicStamp).
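In code terms, the Stamp's trip across the room is a fetch-and-decode loop that runs for every single statement. Here's a hypothetical sketch in C of that idea (not the actual BasicStamp firmware, just the general shape of a token interpreter):

    #include <stddef.h>
    #include <stdint.h>

    enum { OP_PIN_HIGH, OP_PIN_LOW, OP_GOTO, OP_END };

    /* Hypothetical interpreter loop: every program step pays for a fetch,
       a decode, and a dispatch before any real work happens. */
    void interpret(const uint8_t *tokens)
    {
        size_t pc = 0;
        for (;;) {
            uint8_t op = tokens[pc++];      /* fetch the next token */
            switch (op) {                   /* decode and dispatch  */
            case OP_PIN_HIGH: /* real pin work would go here */ break;
            case OP_PIN_LOW:  /* real pin work would go here */ break;
            case OP_GOTO:     pc = tokens[pc];                  break;
            case OP_END:      return;
            }
        }
    }

Compiled PBP skips all of that: the bsf is simply sitting in program memory, ready to execute directly.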
And as far as learning C goes, to each their own. I don't see any advantage in using C with the PICs; maybe on the larger dsPICs, and maybe the 24F series, but not in the 10F/12F/16F/18F range. A program written in PBP and a similar-functioning program written in C will most likely compile down to the same functionality, speed, and code size (all other things being equal), and probably the exact same code itself.