Considering the dynamic variability of the vehicular edge environment and the limited resources of edge servers, this paper proposes a joint task caching and computation offloading scheme based on deep reinforcement learning (DRL). First, since the motion trajectories of different vehicles overlap and their task requests may coincide, this paper designs a vehicle-edge-cloud computing framework that fully utilizes the caching resources of vehicles, edge servers, and the cloud to reduce task processing delay and energy consumption. Second, this paper adopts partial offloading with collaboration between edge servers, which fully utilizes the computing resources of vehicles, edge servers, and the cloud and reduces the load on vehicles and edge servers. In addition, this paper proposes a DRL-based task offloading scheme to obtain better joint task caching and offloading strategies. Simulation results show that the proposed scheme outperforms the comparison schemes and effectively reduces the latency and energy consumption of task processing.
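The abstract does not specify the DRL algorithm or the state/action design. As a rough, non-authoritative illustration of how such a joint decision could be learned, the sketch below uses a DQN-style agent (one common value-based DRL method, not necessarily the paper's choice) whose flat action index jointly encodes a binary caching decision and a discretized partial-offloading ratio. The state layout, dimensions, and all names are assumptions introduced here for demonstration.

```python
# Illustrative sketch only: a minimal DQN-style agent for joint caching and
# partial-offloading decisions. State layout, action encoding, and network
# sizes are assumptions for demonstration, not the paper's actual design.
import random
import torch
import torch.nn as nn

STATE_DIM = 6       # e.g., task size, CPU cycles, channel gain, cache flags (assumed)
N_CACHE = 2         # whether to cache the task result at the edge (assumed binary)
N_RATIOS = 5        # discretized offloading ratios {0, 0.25, 0.5, 0.75, 1} (assumed)
N_ACTIONS = N_CACHE * N_RATIOS  # joint caching + offloading action space

class QNet(nn.Module):
    """Small MLP approximating Q(s, a) over the joint action space."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(STATE_DIM, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
            nn.Linear(64, N_ACTIONS),
        )

    def forward(self, s):
        return self.net(s)

q_net = QNet()
optimizer = torch.optim.Adam(q_net.parameters(), lr=1e-3)
gamma, epsilon = 0.95, 0.1

def select_action(state):
    """Epsilon-greedy choice over the joint (cache decision, offload ratio) action."""
    if random.random() < epsilon:
        return random.randrange(N_ACTIONS)
    with torch.no_grad():
        return int(q_net(torch.tensor(state).float()).argmax())

def decode_action(a):
    """Split the flat action index back into (cache?, offloading ratio)."""
    cache = a // N_RATIOS
    ratio = (a % N_RATIOS) / (N_RATIOS - 1)
    return cache, ratio

def td_update(state, action, reward, next_state):
    """One-step TD update; the reward would be the negative weighted sum of
    the delay and energy cost of the chosen caching/offloading decision."""
    s = torch.tensor(state).float()
    s2 = torch.tensor(next_state).float()
    q = q_net(s)[action]
    with torch.no_grad():
        target = reward + gamma * q_net(s2).max()
    loss = nn.functional.mse_loss(q, target)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

Discretizing the partial-offloading ratio keeps the joint action space finite, which is what makes a value-based method like DQN applicable; a continuous-action method (e.g., DDPG) would be the natural alternative if the ratio were kept continuous.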