Given a graph G and a parameter k, the Chordal Vertex Deletion (CVD) problem asks whether there exists a subset U ⊆ V(G) of size at most k that hits all induced cycles of size at least 4. The existence of a polynomial kernel for CVD was a well-known open problem in the field of Parameterized Complexity. Recently, Jansen and Pilipczuk resolved this question affirmatively by designing a polynomial kernel for CVD of size O(k^161 log^58 k), and asked whether one can design a kernel of size O(k^10). While we do not completely resolve this question, we design a significantly smaller kernel of size O(k^12 log^10 k), inspired by the O(k^2)-size kernel for Feedback Vertex Set. Furthermore, we introduce the notion of the independence degree of a vertex, which is our main conceptual contribution.

Data reduction techniques are widely applied to deal with computationally hard problems in real-world applications. It has been a long-standing challenge to formally express the efficiency and accuracy of these "pre-processing" procedures. The framework of parameterized complexity turns out to be particularly suitable for a mathematical analysis of pre-processing heuristics. Formally, in parameterized complexity each problem instance is accompanied by a parameter k, and we say that a parameterized problem is fixed-parameter tractable (FPT) if there is an algorithm that solves the problem in time f(k) · |I|^O(1), where |I| is the size of the input and f is a computable function of the parameter k alone. Kernelization is the subarea of parameterized complexity that deals with the mathematical analysis of pre-processing heuristics. A parameterized problem is said to admit a polynomial kernel if there is a polynomial-time algorithm (where the degree of the polynomial is independent of the parameter k), called a kernelization algorithm, that reduces the input instance to an instance whose size is bounded by a polynomial p(k) in k, while preserving the answer.
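To make the CVD definition concrete, the following minimal sketch decides small instances by brute force: it tries every deletion set U of size at most k and tests whether the remaining graph is chordal by repeatedly eliminating simplicial vertices (a graph is chordal if and only if it admits such a perfect elimination ordering). The function names and the adjacency-dict representation are illustrative choices, not from the paper; this is exponential in |V(G)| and serves only to pin down the problem, not as a kernelization algorithm.

```python
from itertools import combinations

def is_chordal(adj):
    """adj maps each vertex to the set of its neighbors.

    A graph is chordal iff it has a perfect elimination ordering,
    i.e., we can repeatedly delete a simplicial vertex (one whose
    neighborhood induces a clique) until no vertex remains.
    """
    adj = {v: set(ns) for v, ns in adj.items()}  # work on a copy
    while adj:
        for v in list(adj):
            ns = adj[v]
            # v is simplicial if every pair of its neighbors is adjacent.
            if all(b in adj[a] for a in ns for b in ns if a != b):
                for u in ns:
                    adj[u].discard(v)
                del adj[v]
                break
        else:
            return False  # no simplicial vertex exists: not chordal
    return True

def cvd(adj, k):
    """Return a deletion set U with |U| <= k whose removal makes the
    graph chordal (i.e., hits all induced cycles of length >= 4),
    or None if no such set exists. Pure brute force over subsets."""
    verts = list(adj)
    for size in range(k + 1):
        for U in combinations(verts, size):
            rest = set(verts) - set(U)
            sub = {v: adj[v] & rest for v in rest}
            if is_chordal(sub):
                return set(U)
    return None
```

For example, the 4-cycle is not chordal, so `cvd` on it with k = 0 reports failure, while with k = 1 it returns a single vertex whose removal leaves a chordal path.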
This reduced instance is called a p(k)-kernel for the problem. Observe that if a problem has a kernelization algorithm, then it also has an FPT algorithm: one can first run the kernelization algorithm and then solve the resulting instance of size at most p(k) by brute force, giving a total running time of the form f(k) + |I|^O(1).

Kernelization appears to be an interesting computational approach not only from a theoretical perspective, but also from a practical perspective. There are many real-world applications where even very simple preprocessing can be surprisingly effective, leading to a significant size reduction of the input. Kernelization is a natural tool for measuring the quality of existing preprocessing rules proposed for specific problems, as well as for designing new powerful such rules. The most fundamental question in the field of kernelization is:

Let Π be a parameterized problem that admits an FPT algorithm. Does Π admit a polynomial kernel?

In recent times, the study of kernelization, centred on the above question, has been one of the main areas of research in parameterized complexity, yielding many new important contributions to the theory. These include general results showing that certain classes of parameterized problems have polynomial k...