Accurate depiction of meteorological conditions, especially within the planetary boundary layer (PBL), is important for air pollution modeling, and PBL parameterization schemes play a critical role in simulating the boundary layer. This study examines the sensitivity of the performance of the Weather Research and Forecasting (WRF) model to the use of three different PBL schemes [Mellor-Yamada-Janjic (MYJ), Yonsei University (YSU), and the asymmetric convective model, version 2 (ACM2)]. Comparison of surface and boundary layer observations with 92 sets of daily, 36-h high-resolution WRF simulations with the different schemes over Texas in July–September 2005 shows that the simulations with the YSU and ACM2 schemes produce much less bias than those with the MYJ scheme. Simulations with the MYJ scheme, the only local closure scheme of the three, produce the coldest and moistest biases in the PBL. The differences among the schemes are found to be due predominantly to differences in vertical mixing strength and in the entrainment of air from above the PBL. A sensitivity experiment with the ACM2 scheme confirms this diagnosis.
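As a minimal sketch of the kind of scheme-by-scheme bias comparison described above, the following Python snippet computes the mean bias and root-mean-square error of simulated versus observed 2-m temperature; the arrays and values are purely illustrative assumptions, not the study's data or verification code.

import numpy as np

# Hypothetical paired observations and simulations (K); illustrative only.
obs = np.array([303.1, 304.2, 305.0, 303.8])
sims = {
    "MYJ":  np.array([301.9, 302.8, 303.6, 302.5]),
    "YSU":  np.array([302.8, 303.9, 304.7, 303.4]),
    "ACM2": np.array([302.9, 304.0, 304.8, 303.6]),
}

for scheme, sim in sims.items():
    bias = np.mean(sim - obs)                     # mean bias (model minus observation)
    rmse = np.sqrt(np.mean((sim - obs) ** 2))     # root-mean-square error
    print(f"{scheme}: bias = {bias:+.2f} K, RMSE = {rmse:.2f} K")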
The record-setting 2011 Texas drought/heat wave is examined to identify physical processes, underlying causes, and predictability. October 2010–September 2011 was Texas’s driest 12-month period on record. While the summer 2011 heat wave magnitude (2.9°C above the 1981–2010 mean) was larger than the previous record, events of similar or larger magnitude appear in preindustrial control runs of climate models. The principal factor contributing to the heat wave magnitude was a severe rainfall deficit during antecedent and concurrent seasons related to anomalous sea surface temperatures (SSTs) that included a La Niña event. Virtually all the precipitation deficits appear to be due to natural variability. About 0.6°C warming relative to the 1981–2010 mean is estimated to be attributable to human-induced climate change, with warming observed mainly in the past decade. Quantitative attribution of the overall human-induced contribution since preindustrial times is complicated by the lack of a detected century-scale temperature trend over Texas. Multiple factors altered the probability of climate extremes over Texas in 2011. Observed SST conditions increased the frequency of severe rainfall deficit events from 9% to 34% relative to 1981–2010, while anthropogenic forcing did not appreciably alter their frequency. Human-induced climate change increased the probability of a new temperature record from 3% during the 1981–2010 reference period to 6% in 2011, while the 2011 SSTs increased the probability from 4% to 23%. Forecasts initialized in May 2011 demonstrate predictive skill in anticipating much of the SST-enhanced risk for an extreme summer drought/heat wave over Texas.
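To illustrate how the quoted probability changes translate into relative risk, the sketch below computes risk ratios and the fraction of attributable risk (FAR = 1 - p0/p1) from the percentages in the abstract; the FAR framing and the variable names are assumptions made for illustration, not the paper's method.

# Probabilities quoted in the abstract (reference period vs. altered conditions).
cases = [
    ("severe rainfall deficit, observed SSTs",        0.09, 0.34),
    ("new temperature record, anthropogenic forcing", 0.03, 0.06),
    ("new temperature record, 2011 SSTs",             0.04, 0.23),
]

for label, p0, p1 in cases:
    risk_ratio = p1 / p0          # how many times more likely the event became
    far = 1.0 - p0 / p1           # fraction of attributable risk
    print(f"{label}: risk ratio = {risk_ratio:.1f}, FAR = {far:.2f}")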
Houston. We suggest that the elevated flash densities could result from several factors, including 1) convergence due to the urban heat island effect and 2) increasing levels of air pollution from anthropogenic sources, which produce numerous small droplets and thereby suppress the mean droplet size. The latter effect would enable more cloud water to reach the mixed-phase region, where it is involved in the formation of precipitation and the separation of electric charge, leading to an enhancement of lightning.
This paper documents various unresolved issues in using surface temperature trends as a metric for assessing global and regional climate change. A series of examples, ranging from errors caused by temperature measurements at a monitoring station to undocumented biases in regionally and globally averaged time series, are provided. The issues are poorly understood or documented and relate to micrometeorological impacts due to warm bias in nighttime minimum temperatures, poor siting of the instrumentation, the effect of winds as well as surface atmospheric water vapor content on temperature trends, the quantification of uncertainties in the homogenization of surface temperature data, and the influence of land use/land cover (LULC) change on surface temperature trends. Because of the issues presented in this paper related to the analysis of multidecadal surface temperature trends, we recommend that greater, more complete documentation and quantification of these issues be required for all observation stations that are intended to be used in such assessments. This is necessary for confidence in the actual observations of surface temperature variability and long-term trends.
The Second Texas Air Quality Study (TexAQS II) was conducted in eastern Texas during 2005 and 2006. This 2‐year study included an intensive field campaign, TexAQS 2006/Gulf of Mexico Atmospheric Composition and Climate Study (GoMACCS), conducted in August–October 2006. The results reported in this special journal section are based on observations collected on four aircraft, one research vessel, networks of ground‐based air quality and meteorological (surface and radar wind profiler) sites in eastern Texas, a balloon‐borne ozonesonde‐radiosonde network (part of Intercontinental Transport Experiment Ozonesonde Network Study (IONS‐06)), and satellites. This overview paper provides operational and logistical information for those platforms and sites, summarizes the principal findings and conclusions that have thus far been drawn from the results, and directs readers to appropriate papers for the full analysis. Two of these findings deserve particular emphasis. First, despite decreases in actual emissions of highly reactive volatile organic compounds (HRVOC) and some improvements in inventory estimates since the TexAQS 2000 study, the current Houston area emission inventories still underestimate HRVOC emissions by approximately 1 order of magnitude. Second, the background ozone in eastern Texas, which represents the minimum ozone concentration that is likely achievable through only local controls, can approach or exceed the current National Ambient Air Quality Standard of 75 ppbv for an 8‐h average. These findings have broad implications for air quality control strategies in eastern Texas.
Computer models that project future climates are widely used for adaptation, mitigation and resilience planning. More than 50 such models were assessed and compared in the latest round of the Coupled Model Intercomparison Project, phase 6 (CMIP6), run by the World Climate Research Programme [1]. It is crucial that researchers know the best way to use those outputs to provide consistent information for climate science and policy. We are climate modellers and analysts who develop, distribute and use these projections. We know scientists must treat them with great care. Users beware: a subset of the newest generation of models are 'too hot' [2] and project climate warming in response to carbon dioxide emissions that might be larger than that supported by other evidence [3-7]. Some suggest that doubling atmospheric CO2 concentrations from pre-industrial levels will result in warming above 5 °C, for example. This was not the case in previous generations of simpler models. Earth is a complicated system of interconnected oceans, land, ice and atmosphere, and no computer model could ever simulate every aspect of it exactly. The sixth and latest IPCC assessment weights climate models according to how well they reproduce other evidence; now the rest of the community should do the same.
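As a minimal sketch of the performance-weighting idea described above, the snippet below down-weights projections from models that score poorly against an observational benchmark; the projected-warming values and skill scores are hypothetical, and this is not the IPCC's actual weighting procedure.

import numpy as np

model_warming = np.array([2.7, 3.1, 3.4, 5.2, 5.6])  # hypothetical projected warming (deg C)
model_skill   = np.array([0.9, 1.0, 0.8, 0.3, 0.2])  # hypothetical skill scores vs. observations

weights = model_skill / model_skill.sum()             # normalize skill scores to weights
print(f"unweighted mean:     {model_warming.mean():.2f} deg C")
print(f"skill-weighted mean: {np.sum(weights * model_warming):.2f} deg C")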
The influence of horizontal grid resolution (dx) and horizontal diffusion on the maximum velocity of the sea breeze circulation is discussed using the results of a two-dimensional numerical model. The computed maximum updraft (WMAX) of the sea breeze decreases as dx increases beyond a certain value; below that value, however, WMAX approaches a constant as dx is decreased further. Increasing the grid interval is similar to smoothing the peak values of the velocity, so the peak values decrease as the grid size is increased. Further, it was found that WMAX is significantly weakened by horizontal diffusion. The magnitude of WMAX at a given point is not very meaningful, since the value can be altered by changing the grid size and the smoothing method. The area-weighted WMAX appears to be more physically significant in numerically simulated results. Therefore, an appropriately fine grid should be used for studying a given phenomenon.
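The following sketch illustrates the grid-spacing sensitivity described above by block-averaging an idealized one-dimensional updraft profile onto coarser grids; the profile, the coarsening factors, and the integrated quantity used as a stand-in for an area-weighted measure are all assumptions, not the paper's model.

import numpy as np

dx_fine = 0.5                                    # fine grid spacing (km), illustrative
x = np.arange(0.0, 50.0, dx_fine)
w = 2.0 * np.exp(-((x - 25.0) / 2.0) ** 2)       # idealized sea-breeze updraft (m/s)

for factor in (1, 2, 4, 8):
    dx = dx_fine * factor
    n = (len(w) // factor) * factor
    w_coarse = w[:n].reshape(-1, factor).mean(axis=1)  # block-average onto a coarser grid
    w_int = dx * w_coarse.sum()                        # area-integrated updraft, nearly grid-invariant
    print(f"dx = {dx:.1f} km: point WMAX = {w_coarse.max():.2f} m/s, "
          f"integrated updraft = {w_int:.2f} m/s km")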