Eh... No, it does not support streaming responses.
I know this because I wish it did. You can approximate streaming responses by using progress notifications. If you want something like LLM-style partial-response streaming, you'll have to extend MCP with custom capability flags. It's totally possible to extend it that way, but then it's non-standard.
Perhaps you are alluding to the fact that it's a bidirectional protocol (by spec, at least).
That's transport and message passing. The response isn't streamed; it's delivered as a single message when the task is complete. Don't be confused by the word "Streamable". That's just there because the transport uses SSE to stream a series of JSON-RPC messages from the server to the client. The Response to any specific Request is still a single monolithic message.

In this space, an LLM that supports streaming sends the response to a request as partials while they are generated. This lets you present results faster and gives lower perceived latency. MCP *does not* support this under the current spec. As I said, you can extend MCP and deliver these partials in ProgressNotification messages, but then you're using a non-standard spec extension.
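To make the workaround concrete, here's a rough sketch of the JSON-RPC messages involved. The request opts in to progress updates via a `progressToken` in `params._meta`, and the (non-standard) trick is to smuggle partial output into the `message` field of `notifications/progress`. The tool name `summarize` and the chunk contents are made up for illustration; the message shapes follow my reading of the spec, so double-check against the version you target.

```python
# A tools/call request opting in to progress updates via a progressToken
# (the token goes in params._meta per the MCP spec).
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "summarize",  # hypothetical tool name
        "arguments": {"text": "..."},
        "_meta": {"progressToken": "tok-1"},
    },
}

def partial_notification(token, chunk, progress, total):
    """Non-standard: carry a partial-response chunk in the human-readable
    `message` field. A stock client will just ignore it unless you extend
    the client to reassemble these chunks."""
    return {
        "jsonrpc": "2.0",
        "method": "notifications/progress",
        "params": {
            "progressToken": token,
            "progress": progress,
            "total": total,
            "message": chunk,  # partial text; not what this field is for
        },
    }

chunks = ["The qui", "ck brown ", "fox."]
partials = [
    partial_notification("tok-1", c, i + 1, len(chunks))
    for i, c in enumerate(chunks)
]

# The actual Response is still one monolithic message at the end.
response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {"content": [{"type": "text", "text": "The quick brown fox."}]},
}

# A custom client extension could reassemble the stream like this:
assembled = "".join(n["params"]["message"] for n in partials)
assert assembled == response["result"]["content"][0]["text"]
```

The point of the final assertion is exactly the caveat above: the partials are a side channel you bolted on, and the spec-level Response arrives whole regardless.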