2012
DOI: 10.1029/2012jd017725
Sea surface temperatures in cooler climate stages bear more similarity with atmospheric CO2 forcing

Abstract: The interglacial Marine Isotope Stage (MIS) 11 has received special attention due to its remarkable resemblance to present-day climate. Based on the synchronicity of marine, ice-sheet, and terrestrial proxy responses, warm episodes with intervening cool phase(s) during MIS 11 have been qualitatively established. Here we quantitatively evaluate 15 climate proxies over the 368–552 kyr interval, adopting a novel long-range cross-correlation approach and information-theory-based similarity measures. We also estimate the informat…

Cited by 12 publications (12 citation statements); References: 43 publications. Citing publications span 2014–2020.
“…Unlike cross-correlation, this exchange of information is inherently directional and has no bearing on the variables' common history or inputs. Therefore, it can be utilized for determining the cause-and-effect relationship between two variables [Das Sharma et al.; Vichare et al.]. The transfer entropy between two random variables or processes x and y is mathematically represented as
$$\mathrm{TE}_{x \to y}(\tau) = \sum P\big(y(t+\tau),\, y(t),\, x(t)\big)\, \log_2 \frac{P\big(y(t+\tau),\, y(t),\, x(t)\big)\; P\big(y(t)\big)}{P\big(x(t),\, y(t)\big)\; P\big(y(t+\tau),\, y(t)\big)}$$
where $P\big(x(t), y(t)\big)$ is the joint probability of $x(t)$ and $y(t)$, and $P\big(y(t+\tau), y(t), x(t)\big)$ is the joint probability of $y(t+\tau)$, $y(t)$, and $x(t)$.…”
Section: Transfer Entropy Methods To Evaluate the Cause and Effect Relationship
confidence: 99%
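As an illustration of the formula above, here is a minimal plug-in (histogram) estimator of $\mathrm{TE}_{x \to y}(\tau)$; the function name, equal-width binning scheme, and default bin count are assumptions made for this sketch, not the quoted paper's code.

```python
import numpy as np

def transfer_entropy(x, y, tau=1, bins=8):
    """Estimate TE_{x->y}(tau) in bits for two equal-length 1-D series."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    # Discretize each series into `bins` equal-width states (assumed scheme).
    xd = np.digitize(x, np.histogram_bin_edges(x, bins)[1:-1])
    yd = np.digitize(y, np.histogram_bin_edges(y, bins)[1:-1])
    # Aligned samples of (y(t+tau), y(t), x(t)).
    yf, yp, xp = yd[tau:], yd[:-tau], xd[:-tau]
    n = len(yf)
    te = 0.0
    # Plug-in estimate: sum over every observed joint state (a, b, c).
    states, counts = np.unique(np.stack([yf, yp, xp]), axis=1, return_counts=True)
    for (a, b, c), cnt in zip(states.T, counts):
        p_abc = cnt / n                            # P(y(t+tau), y(t), x(t))
        p_b = np.mean(yp == b)                     # P(y(t))
        p_bc = np.mean((yp == b) & (xp == c))      # P(x(t), y(t))
        p_ab = np.mean((yf == a) & (yp == b))      # P(y(t+tau), y(t))
        te += p_abc * np.log2(p_abc * p_b / (p_bc * p_ab))
    return te
```

For example, `transfer_entropy(co2, sst, tau=2)` would estimate the information flow from a CO2 record to an SST record at a two-sample lag (the variable names are hypothetical).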
“…To quantify the actual information explicitly exchanged between these two variables x and y, an information measure known as transfer entropy was introduced in [34]. It overcomes the limitations of correlation measures and other entropy metrics by quantifying both the amount and the direction of information flow between any two variables, with no bearing on their common history or inputs [34,12,11,3,47]. The TE from a process x to another process y after a time lag τ is the amount of information about the state of y at time t + τ obtained exclusively from the state of x at time t. This can be represented by the following expression [28]:…”
Section: Transfer Entropy and Directionality Index
confidence: 99%
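The directionality index named in this section heading is often defined as the net information flow between the two series. Since the quoted snippet elides its formula, the definition below (the difference of the two directed transfer entropies) is a common convention assumed here for illustration; it reuses `transfer_entropy` from the earlier sketch.

```python
# Directionality index as net information flow: D(tau) > 0 suggests x drives y
# at lag tau, D(tau) < 0 suggests the reverse. This specific definition is an
# assumed convention; the quoted text does not give the paper's exact formula.
def directionality_index(x, y, tau=1, bins=8):
    return transfer_entropy(x, y, tau, bins) - transfer_entropy(y, x, tau, bins)
```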
“…In the presence of multiple drivers, estimating the relative contribution of each variable/proxy to the observed response signal becomes important. To enable this, [48] introduced a measure known as normalized transfer entropy (NTE) that accounts for the amount of information stored in x(t) and y(t + τ), given as [11]:…”
Section: Transfer Entropy and Directionality Index
confidence: 99%
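Because the quoted NTE formula is truncated, the sketch below shows only one plausible normalization, dividing the transfer entropy by the Shannon entropy of the target's future state so the result is bounded by 1; the denominator choice is an assumption and may differ from the form actually used in [48] and [11]. It again reuses `transfer_entropy` from the first sketch.

```python
import numpy as np

def shannon_entropy(states):
    """Shannon entropy (bits) of a 1-D array of discrete states."""
    _, counts = np.unique(states, return_counts=True)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

def normalized_transfer_entropy(x, y, tau=1, bins=8):
    # Normalize by the information content of the target's future state
    # y(t+tau); this denominator is an assumption (the quoted formula is elided).
    y = np.asarray(y, dtype=float)
    yd = np.digitize(y, np.histogram_bin_edges(y, bins)[1:-1])
    return transfer_entropy(x, y, tau, bins) / shannon_entropy(yd[tau:])
```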
“…This is accomplished by applying a moving-average filter to the time series (Carbone et al. 2004), which is akin to detrending the series in order to extract statistically meaningful information. In particular, adequate caution should be exercised to avoid both under- and over-smoothing of the data (Das Sharma et al. 2012). Several moving-average time-window sizes (w_s) are examined.…”
Section: Data Used and Shannon Entropy Estimation
confidence: 99%
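A minimal sketch of moving-average detrending at several window sizes w_s, in the spirit of the procedure described above; the synthetic series and the particular window sizes are illustrative assumptions, not values from the quoted study.

```python
import numpy as np

def moving_average_detrend(series, w):
    """Subtract a centered w-point moving average from the series."""
    series = np.asarray(series, dtype=float)
    trend = np.convolve(series, np.ones(w) / w, mode="same")
    return series - trend

# Compare several window sizes w_s: too small a window tracks the noise
# (undersmoothing the trend estimate), too large a window smooths away
# genuine variability (oversmoothing). Window sizes here are illustrative.
rng = np.random.default_rng(0)
proxy = np.cumsum(rng.standard_normal(500))   # stand-in for a proxy record
for w_s in (5, 11, 25, 51):
    residual = moving_average_detrend(proxy, w_s)
    print(f"w_s={w_s:3d}  residual std = {residual.std():.3f}")
```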