That's almost impossible to say when you don't even give us a model number for the display in question, so we can look at it and see how it works, what controller it has, etc.
But let's say it's a monochrome, 1 bit per pixel, serial interface display. You have 15360 pixels on the screen and you want to update the screen 100 (!) times per second. That's an average bitrate of around 1.5Mbit/s, or 192,000 bytes per second - not counting the overhead of any control bytes.
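To put numbers on it, here's that same arithmetic as a trivial C snippet. The 15360-pixel figure is from your post; 128x120 is just one guess at a resolution that gives that count:

```c
/* Back-of-the-envelope bandwidth check - plain arithmetic, nothing more. */
#include <stdio.h>

int main(void)
{
    const unsigned long pixels = 15360UL;        /* e.g. 128 x 120, 1 bit each  */
    const unsigned long fps    = 100UL;          /* desired refresh rate        */

    unsigned long bits_per_sec  = pixels * fps;  /* 1,536,000 bit/s (~1.5Mbit)  */
    unsigned long bytes_per_sec = bits_per_sec / 8;  /* 192,000 byte/s          */

    printf("%lu bit/s = %lu byte/s\n", bits_per_sec, bytes_per_sec);
    return 0;
}
```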

The MSSP module can send data at a rate of FOsc/4, so at 32MHz the maximum bitrate is 8Mbit/s, but that doesn't give you ANY room to actually fetch the data and feed it to the MSSP module. If you run it at FOsc/16 it'll give you 2Mbit/s, and you'll have a couple of instructions left over to do something useful while the MSSP module sends each byte.
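Something along these lines, as a minimal sketch. This assumes a PIC18-style MSSP and the XC8 compiler; register and bit names vary between devices, so check your datasheet:

```c
/* Hedged sketch: SPI master on a PIC18-style MSSP module at FOsc/16.
 * TRIS bits for the SDO and SCK pins must be set to output elsewhere. */
#include <xc.h>

void spi_init(void)
{
    SSPSTAT = 0x40;     /* CKE = 1: transmit on active-to-idle clock edge */
    SSPCON1 = 0x21;     /* SSPEN = 1, SSPM = 0001: SPI master, FOsc/16    */
}

void spi_send(unsigned char b)
{
    SSPBUF = b;                 /* start shifting the byte out           */
    while (!SSPSTATbits.BF)     /* BF sets when the transfer completes;  */
        ;                       /* a few spare cycles could go here      */
    (void)SSPBUF;               /* dummy read clears BF                  */
}
```

At FOsc/16 each byte takes 32 instruction cycles to shift out, so the busy-wait loop above is where you'd steal time to fetch the next byte of bitmap data.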

Now, why on earth do you need 100 updates per second? Just because you have a timer resolving 1/100 of a second doesn't mean you have to update the display that often. If you update it 20 times per second the average bitrate drops to around 300kbit/s, which I think is more manageable. But then you still have to generate the bitmap data....
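One simple way to do that is to keep the 100Hz timer but only refresh on every 5th tick. A sketch, where the tick flag and update_display() are stand-ins for your own timer ISR and drawing code:

```c
/* Hedged sketch: 100Hz timer tick throttled to 20 display updates/s. */
volatile unsigned char tick_100hz = 0;  /* set to 1 in the timer ISR  */

void update_display(void);              /* your drawing routine       */

void main_loop(void)
{
    unsigned char ticks = 0;

    for (;;) {
        if (tick_100hz) {               /* one 1/100s period elapsed  */
            tick_100hz = 0;
            if (++ticks >= 5) {         /* 100Hz / 5 = 20 updates/s   */
                ticks = 0;
                update_display();
            }
        }
    }
}
```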

Obviously, if this is an intelligent display with built-in fonts etc. it's a different story - but how would we know that when you don't give us any details....

/Henrik.