Deep Neural Networks (DNNs) have achieved excellent performance in intelligent applications. However, resource-constrained devices struggle to support computationally intensive DNNs, while offloading to the cloud may incur prohibitive latency. Promising solutions are to exploit edge computing and to reduce unnecessary computation. Multi-exit DNNs, built on the early-exit mechanism, are effective for the latter, and in the edge computing paradigm, partitioning multi-exit chain DNNs has been shown to accelerate inference effectively. However, although multiple exits reduce computation to some extent, they may cause unstable performance due to variable sample quality, with accuracy falling below that of the original model, especially in the worst case. Furthermore, modern DNNs are commonly structured as directed acyclic graphs (DAGs), which greatly complicates the partitioning of multi-exit DNNs. To address these issues, this paper considers online exit prediction and model execution optimization for multi-exit DNNs and proposes DPDS, a Dynamic Path based DNN Synergistic inference acceleration framework, in which exit designators are designed to prevent samples from iteratively entering exits they will not use; to further promote computational synergy at the edge, the multi-exit DNN is dynamically partitioned according to the network environment, enabling fine-grained computation offloading. Experimental results show that DPDS significantly accelerates DNN inference, by 1.87x to 6.78x.
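
To make the early-exit mechanism and the exit-designator idea concrete, the following is a minimal sketch of designator-routed early-exit inference, assuming a PyTorch-style model. All names here (MultiExitNet, ExitDesignator) and the two-exit toy architecture are hypothetical illustrations of the general technique, not the DPDS implementation.

    # Minimal sketch: a lightweight designator decides a sample's exit up
    # front, so the sample follows one path instead of trying each exit.
    # Hypothetical toy model; not the paper's actual architecture.
    import torch
    import torch.nn as nn

    class MultiExitNet(nn.Module):
        """Toy chain DNN with one early exit and one final exit."""
        def __init__(self, num_classes: int = 10):
            super().__init__()
            self.block1 = nn.Sequential(nn.Linear(32, 64), nn.ReLU())
            self.exit1 = nn.Linear(64, num_classes)   # early-exit head
            self.block2 = nn.Sequential(nn.Linear(64, 64), nn.ReLU())
            self.exit2 = nn.Linear(64, num_classes)   # final-exit head

        def forward(self, x: torch.Tensor, use_early_exit: bool) -> torch.Tensor:
            h = self.block1(x)
            if use_early_exit:
                return self.exit1(h)   # stop here; later blocks never run
            h = self.block2(h)
            return self.exit2(h)

    class ExitDesignator(nn.Module):
        """Cheap per-sample router: predicts which exit to take, once,
        so the sample does not iteratively enter every exit head."""
        def __init__(self):
            super().__init__()
            self.head = nn.Linear(32, 2)  # scores for {early, final} exit

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            return self.head(x).argmax(dim=-1)  # 0 -> early, 1 -> final

    if __name__ == "__main__":
        model, designator = MultiExitNet(), ExitDesignator()
        x = torch.randn(1, 32)
        route = designator(x).item()   # one routing decision per sample
        logits = model(x, use_early_exit=(route == 0))
        print("exit:", "early" if route == 0 else "final",
              "| logits shape:", tuple(logits.shape))

In a partitioned edge deployment, the blocks before the chosen exit could run on the device and the remainder on an edge server, with the split point chosen according to the current network conditions, as the framework described above proposes.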