No, it just sends zeros generated by the CPU (like [/dev/zero], if you know Linux) and the other computer just dumps them (like [/dev/null])
I think you got your units mixed up. Network speeds are usually quoted in megabits (Mb), not megabytes (MB). Also, a MiB is 1024² (1,048,576) bytes while a MB is 1000² (1,000,000) bytes, so 12.8 MiB/s is not 1.6 MB/s after conversion, although many people just write MB when they mean MiB...
And you can't get the full 100 Mbps (12.5 MB/s) anyway. But to maximize your upload speed: disconnect every device other than the one you want to upload from, and stop any programs on that device that may use the network at all.
Nah, I was getting 1.6 megabytes per second, and 100 megabits per second is the theoretical max. 100 megabits per second is 12.5 megabytes per second, and 1.6 MB/s divided by 12.5 MB/s is 0.128, or 12.8% of what I should be getting.
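The arithmetic above can be double-checked in a couple of lines of Python (purely illustrative, using the 100 Mbps and 1.6 MB/s figures from this thread):

```python
# Sanity check of the megabit -> megabyte conversion from the thread.

link_mbps = 100        # theoretical link speed, megabits per second
measured_mb_s = 1.6    # measured upload speed, megabytes per second

# 1 byte = 8 bits, so divide megabits by 8 to get megabytes.
link_mb_s = link_mbps / 8             # 12.5 MB/s theoretical max
fraction = measured_mb_s / link_mb_s  # share of theoretical max

print(link_mb_s)   # 12.5
print(fraction)    # 0.128, i.e. 12.8%
```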
NOTE: ALL UNIT CONVERSIONS FOR THIS REPLY WERE SUPPLIED BY GOOGLE.
*but maybe not the quotes...