While the paper suggests that patchwise inference generally decreases peak memory usage across models, our experiments show the opposite for some of them: in our peak-memory comparisons, mcunet-vww2, mcunet-in2, and mcunet-in3 exhibit higher peak memory usage with patchwise inference than without it.