Pope Francis wrote in his Encyclical Letter Laudato Si’: On Care for Our Common Home: “Instead of resolving the problems of the poor and thinking of how the world can be different, some can only propose a reduction in the birth rate.” … “To blame population growth instead of extreme and selective consumerism on the part of some is one way of refusing to face the issues.” Here, we test the hypothesis that population size does not matter. We do so in terms of the effect of the size of the human population on its emission of greenhouse gases. We find that the hypothesis is false: POPULATION MATTERS. Ceteris paribus, the larger the population of human beings on Planet Earth, the more difficult it will be to reduce, and finally eliminate, the emission of greenhouse gases by humanity and, thereby, constrain human-caused climate change, that is, Anthropogenic Global Warming.
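The ceteris-paribus argument above can be made precise with the standard Kaya identity; the abstract does not name it, so the following math block is an illustrative sketch, not a quotation from the paper.

```latex
% Kaya identity (illustrative sketch, not quoted from the paper):
% F = CO2 emissions, P = population, G = world GDP, E = primary energy.
\[
  F \;=\; P \times \frac{G}{P} \times \frac{E}{G} \times \frac{F}{E}
\]
% Holding per-capita GDP (G/P), energy intensity (E/G), and carbon
% intensity (F/E) fixed -- the abstract's "ceteris paribus" -- emissions
% F scale linearly with population P, which is the sense in which
% population matters.
```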
In our original study we crafted trajectories for developed and developing countries that phased out greenhouse-gas emissions during 2015-2065 such that the maximum global warming does not exceed the 2°C threshold adopted by the UN Framework Convention on Climate Change, and the cumulative emissions for developed and developing countries are identical. Here we examine the effects of increasing the start year from 2015 to 2030 in 5-year intervals, and the phase-out period from 50 to 100 years in 10-year intervals. We find that phase-out during 2020-2100 is optimal. This phase-out delays the year of peak emission from 2015 to 2030 for developed countries and from 2042 to 2053 for developing countries. It also increases the time from peak emissions to zero emissions from 50 to 70 years for developed countries and from 23 to 47 years for developing countries. Both outcomes should facilitate agreement on the Revised Fair Plan by the UNFCCC.
We apply Singular Spectrum Analysis (SSA) to four datasets of observed global-mean near-surface temperature from start year t0 through 2012: HadCRU (t0 = 1850), NOAA (t0 = 1880), NASA (t0 = 1880), and JMA (t0 = 1891). For each dataset, SSA reveals a trend of increasing temperature and several quasi-periodic oscillations (QPOs). QPOs 1, 2 and 3 are predictable on a year-by-year basis by sine waves with periods/amplitudes of: 1) 62.4 years/0.11°C; 2) 20.1 to 21.4 years/0.04°C to 0.05°C; and 3) 9.1 to 9.2 years/0.03°C to 0.04°C. The remainder of the natural variability is not predictable on a year-by-year basis. We represent this noise by its 90 percent confidence interval. We combine the predictable and unpredictable natural variability with the temperature changes caused by the 11-year solar cycle and by humanity, the latter for both the Reference and Revised-Fair-Plan scenarios for future emissions of greenhouse gases. The resulting temperature departures show that we have moved from the first phase of learning—Ignorance—through the second phase—Uncertainty—and are now entering the third phase—Resolution—when the human-caused signal is much larger than the natural variability. Accordingly, it is now time to transition to the post-fossil-fuel age by phasing out fossil-fuel emissions from 2020 through 2100.
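As a concrete illustration of the quoted sine-wave representation, the predictable part of the natural variability can be reconstructed as below. This is a minimal sketch, not the authors' code: the phases, which the abstract does not give, are set to zero, and where a period/amplitude range is quoted the lower bound is used.

```python
# Minimal sketch (not the authors' code): sum of the three sine-wave QPOs
# quoted in the abstract; phases are unknown and set to zero here.
import numpy as np

years = np.arange(1850, 2013)

# (period in years, amplitude in deg C), copied from the abstract.
qpos = [(62.4, 0.11), (20.1, 0.04), (9.1, 0.03)]

predictable = sum(amp * np.sin(2 * np.pi * (years - years[0]) / period)
                  for period, amp in qpos)

# The unpredictable remainder is represented by its 90% confidence
# interval; for a Gaussian residual with standard deviation sigma, that
# interval is roughly +/- 1.645 * sigma about the mean.
```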
A maximum global-mean warming of 2°C above preindustrial temperatures has been adopted by the United Nations Framework Convention on Climate Change to “prevent dangerous anthropogenic interference with the climate system”. Attempts to find agreements on emissions reductions have proved highly intractable because industrialized countries are responsible for most of the historical emissions, while developing countries will produce most of the future emissions. Here we present a Fair Plan for reducing global greenhouse-gas emissions. Under the Plan, all countries begin mitigation in 2015 and reduce greenhouse-gas emissions to zero in 2065. Developing countries are required to follow a mitigation trajectory that is less aggressive in the early years of the Plan than the mitigation trajectory for developed countries. The trajectories are chosen such that the cumulative emissions of the Kyoto Protocol’s Annex B (developed) and non-Annex B (developing) countries are equal. Under this Fair Plan the global-mean warming above preindustrial temperatures is held below 2°C.
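A minimal sketch of the equal-cumulative-emissions construction follows, assuming linear declines and illustrative placeholder emission levels; the paper's actual trajectories, units, and tuning are not reproduced here.

```python
# Minimal sketch (not the paper's model): linear phase-out trajectories for
# developed and developing countries, both reaching zero in 2065, with the
# developing-country decline delayed; cumulative emissions are then matched.
# Baseline levels e0 are illustrative placeholders, not data.
import numpy as np

years = np.arange(2015, 2066)

def trajectory(e0, peak_year):
    """Flat at e0 until peak_year, then linear decline to zero in 2065."""
    e = np.full_like(years, float(e0), dtype=float)
    decline = years >= peak_year
    e[decline] = e0 * (2065 - years[decline]) / (2065 - peak_year)
    return e

developed = trajectory(e0=20.0, peak_year=2015)   # declines immediately
developing = trajectory(e0=15.0, peak_year=2042)  # less aggressive early on

# The Fair Plan tunes the developing-country trajectory so that the two
# cumulative sums are equal; here we simply rescale to enforce that.
developing *= developed.sum() / developing.sum()
assert np.isclose(developed.sum(), developing.sum())
```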
The instrumental temperature records are affected by both external climate forcings—in particular, the increase of long-lived greenhouse gas emissions—and natural, internal variability. Estimates of the value of equilibrium climate sensitivity—the change in global-mean equilibrium near-surface temperature due to a doubling of the pre-industrial CO2 concentration—and other climate parameters using these observational records are affected by the presence of the internal variability. A different realization of the natural variability will result in different estimates of the values of these climate parameters. In this study we apply Bayesian estimation to simulated temperature and ocean heat-uptake records generated by our Climate Research Group’s Simple Climate Model for known values of the equilibrium climate sensitivity, ΔT2x; the direct sulfate aerosol forcing in reference year 2000, FASA; and the oceanic heat diffusivity, κ. We choose the simulated records for one choice of values of the climate parameters to serve as the synthetic observations. To each of the simulated temperature records we add a number of draws of the quasi-periodic oscillations and stochastic noise, determined from the observed temperature record. For cases considering only values of ΔT2x and/or κ, the Bayesian estimation converges to the value(s) of ΔT2x and/or κ used to generate the synthetic observations. However, for cases studying FASA, the Bayesian analysis does not converge to the “true” value used to generate the synthetic observations. We show that this is a problem of low signal-to-noise ratio: by substituting an artificial, continuously increasing sulfate record, we greatly improve the value obtained through Bayesian estimation. Our results indicate that Bayesian learning techniques will be useful tools in constraining the values of ΔT2x and κ but not FASA. In our Group’s future work we will extend the methods used here to the observed, instrumental records of global-mean temperature increase and ocean heat uptake.
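The estimation idea can be shown in miniature. The following is a minimal sketch, assuming a toy linear forward model in place of the Group's Simple Climate Model: synthetic observations are generated with a known "true" parameter plus noise, and a grid posterior recovers it.

```python
# Minimal sketch (not the Group's Simple Climate Model): Bayesian updating
# of one parameter against synthetic observations with a known true value.
import numpy as np

rng = np.random.default_rng(0)

def forward(sensitivity, t):
    """Toy forward model: warming grows with time, scaled by sensitivity."""
    return sensitivity * 0.01 * t

t = np.arange(100)
true_sensitivity = 3.0
noise_sd = 0.08  # deg C, same order as the observed residual noise
obs = forward(true_sensitivity, t) + rng.normal(0.0, noise_sd, t.size)

# Grid posterior with a flat prior: likelihood is Gaussian in the residuals.
grid = np.linspace(1.0, 6.0, 501)
log_like = np.array([-0.5 * np.sum((obs - forward(s, t)) ** 2) / noise_sd**2
                     for s in grid])
post = np.exp(log_like - log_like.max())
post /= post.sum()

print("posterior mean:", np.sum(grid * post))  # converges near 3.0
```

With a signal that grows steadily relative to the noise, the posterior tightens around the true value; a flat, low-amplitude signal (as with the sulfate record) would leave the posterior broad, which is the signal-to-noise problem the abstract describes.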
Earth is the only habitable planet in the solar system and beyond in interstellar space for a distance that would take us at least 80,000 years to traverse at the speed of Voyager 1. Thus our home planet is “This Island Earth”. Here we use our Simple (engineering-type) Climate Model to calculate the change in global-mean near-surface air temperature from 1765 through the third millennium for historical emissions and two scenarios of future emissions of greenhouse gases: (1) a Reference scenario of unabated emissions, and (2) our Fair Plan scenario wherein emissions are phased out to zero from 2020 to 2100. The temperature change for the Reference case increases to 5.2°C (9.4°F) in 2225 and remains there for at least 40 human generations. By design, the temperature change for the Fair Plan increases only to 2°C (3.6°F)—the limit adopted by the UN Framework Convention on Climate Change “to prevent dangerous anthropogenic interference with the climate system”—in 2082 and thereafter decreases through the remainder of the millennium. Accordingly, we need to adopt the Fair Plan to safeguard the climate of “This Island Earth”.
Earth’s climate future is in the hands of humanity. If emissions of greenhouse gases remain unabated, Earth’s climate will return to the climate of the Late Eocene, 35 million years ago, when sea level was 73 meters (240 feet) higher than today. Should that occur, many coastal cities around the world would be inundated. Moreover, the Global Warming of this unabated Reference case will be comparable to the Global Warming from the Last Glacial Maximum 21,000 years ago to the beginning of the Holocene interglacial climate 11,000 years ago. However, this human-caused Global Warming would occur 50 times faster than that caused by nature. Alternatively, humanity can mitigate greenhouse-gas emissions to keep Global Warming below the 2°C maximum adopted by the United Nations Framework Convention on Climate Change “to prevent dangerous anthropogenic interference with the climate system”. This mitigation can either be done rapidly, as in the “80/50” Plan to reduce greenhouse-gas emissions 80% by 2050, or much more slowly, from 2020 to 2100, as in the Fair Plan to Safeguard Earth’s Climate. The Fair Plan is a compromise between doing nothing, as in the Reference case, and rapidly reducing greenhouse-gas emissions, as in the 80/50 Plan. Regardless of the Plan chosen to reduce greenhouse-gas emissions to keep Global Warming below the UNFCCC limit of 2°C (3.6°F), it should not be tantamount to our saying to one of our planetary spacecraft, “Bon voyage; call us when you get to your planetary destination.” Rather, as with our spacecraft, the chosen climate-change policy should be monitored throughout the 21st century and Midcourse Corrections made thereto as needed to keep our “Climate Spacecraft” on track to achieve its “Climate Target”.
This study shows that the heretofore assumed condition for no temperature-profile (TP)/lapse-rate feedback, ΔT(z) = ΔT(0) for all altitudes z, or equivalently ∂ΔT(z)/∂z = 0, in fact yields a negative feedback. The correct condition for no TP feedback is ΔT(z)/T(z) = ΔTs/Ts for all z, where Ts is the surface temperature. This condition translates into a uniform increase (decrease) in lapse rate with altitude for an increase (decrease) in Ts. The temperature changes caused by a change in solar irradiance and/or planetary albedo satisfy the condition for no TP feedback. The temperature changes caused by a change in greenhouse gas concentration do not satisfy the condition for no TP feedback and, instead, yield a positive feedback.
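Assuming the no-feedback conditions as reconstructed above (the source's equations were lost in extraction), a short check in the simplest grey radiative-equilibrium setting shows why solar-irradiance and albedo changes satisfy them:

```latex
% A minimal check, assuming the reconstructed condition above and a grey
% radiative-equilibrium atmosphere (not taken from the paper): each
% level's temperature scales with the absorbed solar flux as
\[
  T(z) \;=\; \left[\frac{S_0\,(1-\alpha)}{4\sigma}\right]^{1/4} f(z),
\]
% where f(z) is independent of the solar constant S_0 and albedo \alpha.
% A change in S_0 or \alpha therefore perturbs every level by the same
% fraction, satisfying the no-feedback condition
\[
  \frac{\Delta T(z)}{T(z)} \;=\; \frac{\Delta T_s}{T_s}
  \qquad \text{for all } z .
\]
```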
Measurements show that the Earth’s global-average near-surface temperature has increased by about 0.8°C since the 19th century. It is critically important to determine whether this global warming is due to natural causes, as contended by climate contrarians, or by human activities, as argued by the Intergovernmental Panel on Climate Change. This study updates our earlier calculations which showed that the observed global warming was predominantly human-caused. Two independent methods are used to analyze the temperature measurements: Singular Spectrum Analysis and Climate Model Simulation. The concurrence of the results of the two methods, each using 13 additional years of temperature measurements from 1998 through 2010, shows that it is humanity, not nature, that has increased the Earth’s global temperature since the 19th century. Humanity is also responsible for the most recent period of warming from 1976 to 2010. Internal climate variability is primarily responsible for the early 20th century warming from 1904 to 1944 and the subsequent cooling from 1944 to 1976. It is also found that the equilibrium climate sensitivity is on the low side of the range given in the IPCC Fourth Assessment Report.
Floods are one of nature's most destructive disasters because of the immense damage they inflict on land, buildings, and human life. It is difficult to forecast the areas that are vulnerable to flash flooding because of the dynamic and complex nature of flash floods. Earlier identification of flash-flood-susceptible sites can therefore be performed using advanced machine learning models for managing flood disasters. In this study, we applied and assessed two new hybrid ensemble models, namely Dagging and Random Subspace (RS), coupled with Artificial Neural Network (ANN), Random Forest (RF), and Support Vector Machine (SVM), three other state-of-the-art machine learning models, for modelling flood susceptibility maps of the Teesta River basin in the northern region of Bangladesh. The application of these models used twelve flood-influencing factors with 413 current and former flooding points, which were transferred into a GIS environment. The information gain ratio and multicollinearity diagnostic tests were employed to determine the association between flood occurrences and the influencing factors. For the validation and comparison of the models' predictive ability, statistical appraisal measures such as the Friedman, Wilcoxon signed-rank, and paired t-tests, together with the Receiver Operating Characteristic (ROC) curve, were employed. The value of the Area Under the Curve (AUC) of the ROC was above 0.80 for all models. For flood susceptibility modelling, the Dagging model performs best, followed by the RF, ANN, SVM, and RS models, and then by the several benchmark models. The approach and the solution-oriented outcomes outlined in this paper will assist state and local authorities as well as policy makers in reducing flood-related threats and in implementing effective mitigation strategies against future damage. Funding: this work was supported by a PhD scholarship granted by Fundacao para a Ciencia e a Tecnologia, I.P. (FCT), Portugal, under the PhD Programme FLUVIO–River Restoration and Management, grant number PD/BD/114558/2016.
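For readers who want to experiment with the ensembling idea, here is a minimal sketch with scikit-learn. It is not the study's code: scikit-learn ships no Dagging estimator, so BaggingClassifier stands in for the bagging family and a feature-subsampled BaggingClassifier emulates Random Subspace, and the data are random placeholders for the twelve factors and 413 points.

```python
# Minimal sketch (not the study's code) of ensemble flood-susceptibility
# classifiers scored by ROC AUC, as in the abstract's evaluation.
import numpy as np
from sklearn.ensemble import BaggingClassifier, RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
X = rng.normal(size=(413, 12))                  # 12 influencing factors
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)   # synthetic flood label

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

models = {
    "bagged ANN (Dagging stand-in)": BaggingClassifier(
        MLPClassifier(max_iter=2000), n_estimators=10),
    "Random Subspace ANN": BaggingClassifier(
        MLPClassifier(max_iter=2000), n_estimators=10,
        bootstrap=False, max_features=0.6),   # subsample features, not rows
    "RF": RandomForestClassifier(n_estimators=200, random_state=0),
}

for name, model in models.items():
    auc = roc_auc_score(y_te, model.fit(X_tr, y_tr).predict_proba(X_te)[:, 1])
    print(f"{name}: AUC = {auc:.2f}")
```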
In our Fair Plan 5 paper, we compared the CO2 emissions of the 80%-Emission-Reduction-By-2050 (80/50) Plan with the CO2 emissions of our Fair Plan to Safeguard Earth’s Climate. We found that the 80/50 Plan reduced CO2 emissions more rapidly than necessary to achieve the principal objective of the Fair Plan: to keep Global Warming (GW) within the 2°C (3.6°F) limit adopted by the UN Framework Convention on Climate Change (UNFCCC) “to prevent dangerous anthropogenic interference with the climate system”. Here, we ask the “What If” question: “What would the GW of the 80/50 Plan be post 2100 if its CO2 emissions post 2100 were kept at their 2100 value?” We find that although the GW of the 80/50 Plan decreases slightly over part of the 21st century, it does not remain constant thereafter. Rather, the GW of the 80/50 Plan begins to increase in 2088, exceeds that of the Fair Plan beginning in 2230, exceeds the 2°C (3.6°F) limit of the UNFCCC in 2596, and ends the millennium at 2.7°C (4.8°F). Thus, not only does the 80/50 Plan phase out humanity’s CO2 emissions faster than necessary to fulfill the UNFCCC constraint, it also fails that constraint if its CO2 emissions post 2100 are kept at their 2100 value. Accordingly, we believe that the Fair Plan to Safeguard Earth’s Climate is superior to the 80/50 Plan.
For low-income communities in South Africa, coal is the most common solid fuel, and it is burnt in a variety of devices, including imbaulas and cast-iron stoves. The present work was conducted with the aim of determining the effect of fuel particle size on the performance of coal typically sourced by low-income households in townships in South Africa, and of comparing that performance with a feed char in a common cast-iron stove. Four fuel particle sizes (15, 20, 30, and 40 mm), as well as a composite of the sizes, were devolatilized at 550°C and tested against their untreated coal analogues to evaluate the thermal performance of each fuel. The thermal performance assessment metrics are ignition time, water boiling time, and heat transfer and combustion efficiencies, while CO and CO2 emissions were measured for the calculation of CO/CO2 ratios. Ignition times were found to decrease from coals to chars and to decrease with increasing particle size. The effect of fuel type on the water boiling time was only observed in the later stages of the burn cycle, with the char boiling a 2 L batch of water in an average of 24 min, while the coals reported an average boiling time of 20 min. Heat transfer efficiencies showed no significant variation with fuel type or particle size, with the average efficiency for both the coals and the chars being around 66%. The fuels’ performance was better gauged by the combustion efficiency, which was found to improve marginally from the coal fuels to the chars, and to increase with increasing particle size. Results from this test work could contribute to the performance inventories for the combustion of domestic coal mined in South Africa in a typical cast-iron stove used in informal settlements. Funding: the authors acknowledge the DS&T and NRF (Coal Research Chair Grant No. 86880) of SA for financing this investigation.
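The CO/CO2 ratio named above reduces to simple arithmetic; a minimal sketch follows, with placeholder concentrations rather than measured values.

```python
# Minimal sketch (placeholder ppm values, not the study's measurements):
# CO/CO2 ratio and the nominal combustion efficiency derived from it.
co_ppm, co2_ppm = 400.0, 30000.0

co_co2_ratio = co_ppm / co2_ppm
# Modified combustion efficiency: fraction of emitted carbon fully oxidized.
mce = co2_ppm / (co2_ppm + co_ppm)

print(f"CO/CO2 ratio: {co_co2_ratio:.3f}, combustion efficiency: {mce:.1%}")
```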
With the election of Donald Trump as President of the United States of America, it appears likely that the initiation of mitigation of human-caused Global-Warming/Climate-Change will be delayed many years. Accordingly, here we calculate the Emission Phaseout Duration, D = YE - YS, where YS and YE are the Start and End Years of the emissions reduction, for YS = 2020, 2025 and 2030, and maximum Global Warming targets, ΔTmax = 2.0°C, 1.9°C, 1.8°C, 1.7°C, 1.6°C and 1.5°C. The 2.0°C and 1.5°C maxima are the “Hard” and “Aspirational” targets of the 2015 Paris Climate Agreement. We find that D decreases with increasing YS from 2020, and with decreasing ΔTmax. In particular, D decreases from: 1) 76 years for YS = 2020 to 53 years for YS = 2030 for ΔTmax = 2.0°C, and 2) 34 years for YS = 2020 to 7 years for YS = 2030 for ΔTmax = 1.5°C. Thus, delaying the initiation of the phaseout of greenhouse-gas emissions from 2020 to 2030 makes it more difficult to achieve ΔTmax = 2.0°C and impossible to achieve ΔTmax = 1.5°C.
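The quoted durations directly imply the phase-out end years YE = YS + D; a small check using only the numbers stated in the abstract:

```python
# Durations D (years) copied from the abstract, keyed by (YS, Delta T_max);
# the end years below follow from YE = YS + D.
D = {
    (2020, 2.0): 76, (2030, 2.0): 53,
    (2020, 1.5): 34, (2030, 1.5): 7,
}

for (ys, t_max), d in D.items():
    print(f"start {ys}, target {t_max} C: phase-out ends {ys + d} (D = {d} yr)")
```

So a 1.5°C target with a 2030 start requires reaching zero emissions by 2037, which illustrates why the abstract calls that target impossible after a decade's delay.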
Surface ozone (O3) is a secondary pollutant harmful to human health and a greenhouse gas that is one of the prime climate forcers. Because of the clean atmospheric environment of the Antarctic region, and given the complexity of O3 chemistry, observation of surface O3 variability in this region is necessary in the quest to better understand the potential sources and sinks of polar surface O3. In this paper, we highlight our observations of O3 variability at the Great Wall Station (GWS) during the austral summer of December 2018 and January 2019. Continuous surface O3 measurements at the GWS, Antarctica, were carried out using an Ecotech ozone analyzer. To understand the role of meteorological conditions in the temporal variation of O3, meteorological data were obtained from the conventional auto-observational station at the GWS. The Hybrid Single-Particle Lagrangian Integrated Trajectory (HYSPLIT) model was employed to investigate air mass transport over the region. The observed austral-summer surface O3 concentrations at the GWS exhibited variability and were significantly lower than those previously observed at other permanent coastal stations in Antarctica. The surface O3 variability at the GWS was strongly influenced by the synoptic change of air mass origin, although the roles of photochemical production and destruction remain uncertain. The air masses that reached the GWS were characterized by marine properties and stable surface O3. The unique character of surface O3 at the coastal GWS site was emphasized by its synoptic air mass characteristics, which displayed a significant influence on surface O3 variability. Air masses that traveled relatively shorter distances over the ocean were linked to lower O3 levels, whereby the marine transport of reactive bromine (Br) species is thought to play a significant role in the tropospheric chemistry that leads to O3 destruction. Meanwhile, the diurnal variation indicated that the O3 background concentration levels were not strongly associated with local atmospheric conditions. Funding: this work was funded by the Sultan Mizan Antarctic Research Foundation (YPASM, 2017), Malaysia, supported by the Chinese Arctic and Antarctic Administration (CAA), and supported by Universiti Malaysia Sabah (UMS).
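A minimal sketch of how a mean diurnal cycle of surface O3 is typically computed from hourly data follows; the series is synthetic, not the station's record or processing chain.

```python
# Minimal sketch (synthetic hourly series, not the GWS data): mean diurnal
# cycle of surface O3, the kind of summary behind the abstract's
# diurnal-variation statement.
import numpy as np
import pandas as pd

idx = pd.date_range("2018-12-01", "2019-01-31 23:00", freq="h")
rng = np.random.default_rng(0)
o3 = pd.Series(20 + rng.normal(0, 2, idx.size), index=idx)  # ppb, synthetic

diurnal = o3.groupby(o3.index.hour).mean()   # mean concentration per hour
print(diurnal.round(1))
```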
The most recent US Congressional climate bill, H.R.5271 in 2014, proposes to reduce US emissions of carbon dioxide relative to their 2005 value by 80% in 2050. This bill does not provide a rationale for this rapid phase down of CO2 emissions. In 2012, we crafted a Fair Plan to Safeguard Earth’s Climate such that: 1) the cumulative trade-adjusted CO2 emissions by the developing countries equal the cumulative trade-adjusted CO2 emissions by the developed countries; 2) the maximum global warming above preindustrial temperature does not exceed the 2°C (3.6°F) chosen by the United Nations Framework Convention on Climate Change “to prevent dangerous anthropogenic interference with the climate system”; and 3) the phase out of CO2 emissions begins as late as possible in the 21st century and proceeds at the slowest possible pace, consistent with objectives 1 and 2. The Fair Plan begins in 2020 and reduces the world’s emissions to zero in 2100. In the Fair Plan the emissions of the developed countries, including the United States, reach 80% below their 2005 values in 2094, that is, 44 years later than proposed by H.R.5271. While it is imperative that humanity begins to wean itself from fossil fuels no later than 2020, the transition from fossil to non-fossil energy need not be completed before 2100 if all countries follow their Fair Plan trajectories.
The nonlinearity of the strain energy over the interval during which seismic load is applied to a geostructure makes it difficult for a seismic designer to make appropriate engineering judgments in a timely manner. The nonlinear stress and strain of an embankment therefore need to be evaluated using a combination of suitable methods. In this study, a large-scale geostructure was seismically simulated and analyzed using the nonlinear finite element method (NFEM), and linear regression, a soft computing (SC) technique, was applied to evaluate the NFEM results. This supports engineering judgment, because the design of geostructures is usually considered an imprecise process owing to the high nonlinearity of the seismic response of large-scale geostructures, and such nonlinearity may complicate decision making in geostructure seismic design. The probability distributions of the nonlinear stress and strain can be observed, and the densities of stress and strain are predicted using histograms. The results of both the NFEM simulation and the linear regression confirm that the nonlinear strain energy and stress behavior have close values of R2 and root-mean-square error (RMSE). The linear regression and histogram simulations demonstrate the accuracy of the NFEM results. The outcome of this study serves as a guide to improving the quality of engineering judgment in the seismic analysis of an embankment by validating NFEM results with appropriate soft computing techniques.
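The R2/RMSE agreement check named above can be sketched in a few lines; the stress-strain arrays below are synthetic stand-ins for the NFEM output, not the study's data.

```python
# Minimal sketch (synthetic data, not the study's NFEM output): validating
# simulated outputs against a linear-regression fit via R2 and RMSE.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score, mean_squared_error

rng = np.random.default_rng(0)
strain = rng.uniform(0.0, 0.02, 200).reshape(-1, 1)
stress_nfem = 1.2e4 * strain.ravel() + rng.normal(0, 10.0, 200)  # stand-in

fit = LinearRegression().fit(strain, stress_nfem)
stress_lr = fit.predict(strain)

r2 = r2_score(stress_nfem, stress_lr)
rmse = np.sqrt(mean_squared_error(stress_nfem, stress_lr))
print(f"R2 = {r2:.3f}, RMSE = {rmse:.2f}")
```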
Previously we have used Singular Spectrum Analysis (SSA) to deconstruct the global-mean near-surface temperature observations of the Hadley Centre—Climate Research Unit that extend from 1850 through 2012. While SSA is a very powerful tool, it is rather like a statistical “black box” that gives little intuition about its results. Accordingly, here we use the simplest statistical tool to provide such intuition, the Simple Moving Average (SMA). Firstly we use a 21-year SMA. This reveals a nonlinear trend and an oscillation of about 60 years' length. Secondly we use a 61-year SMA on the raw observations. This yields a nonlinear trend. We subtract this trend from the raw observations and apply a 21-year SMA. This yields a Quasi-periodic Oscillation (QPO) with a period and amplitude of about 62.4 years and 0.11°C. This is the QPO we discovered in our 1994 Nature paper, which has come to be called the Atlantic Multidecadal Oscillation. We then subtract QPO-1 from the detrended observations and apply an 11-year SMA. This yields QPO-2 with a period and amplitude of about 21.0 years and 0.04°C. We subtract QPO-2 from the detrended observations minus QPO-1 and apply a 3-year SMA. This yields QPO-3 with a period and amplitude of about 9.1 years and 0.03°C. QPOs 1, 2 and 3 are sufficiently regular in period and amplitude that we fit them by sine waves, thereby yielding the above periods and amplitudes. We then subtract QPO-3 from the detrended observations minus QPOs 1 and 2. The result is too irregular in period and amplitude to be fit by a sine wave. Accordingly we represent this unpredictable part of the temperature observations by a Gaussian probability distribution (GPD) with a mean of zero and standard deviation of 0.08°C. The sum of QPOs 1, 2 and 3 plus the GPD can be used to project the natural variability of the global-mean near-surface temperature to add to, and be compared with, the continuing temperature trend caused predominantly by humanity’s continuing combustion of fossil fuels.
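The cascade described above maps directly onto code. The following is a minimal sketch mirroring the stated window lengths; the input series is synthetic, not the HadCRU record.

```python
# Minimal sketch (synthetic input, not the observations): iterated centered
# simple moving averages peel off a trend and three successively faster QPOs.
import numpy as np

def sma(x, window):
    """Centered simple moving average; edges are left as NaN."""
    half = window // 2
    out = np.full_like(x, np.nan, dtype=float)
    for i in range(half, x.size - half):
        out[i] = x[i - half:i + half + 1].mean()
    return out

years = np.arange(1850, 2013)
rng = np.random.default_rng(0)
temps = (0.005 * (years - 1850) + 0.11 * np.sin(2 * np.pi * years / 62.4)
         + rng.normal(0, 0.08, years.size))  # synthetic stand-in

trend = sma(temps, 61)                   # nonlinear trend
detrended = temps - trend
qpo1 = sma(detrended, 21)                # ~62-year oscillation
qpo2 = sma(detrended - qpo1, 11)         # ~21-year oscillation
qpo3 = sma(detrended - qpo1 - qpo2, 3)   # ~9-year oscillation
residual = detrended - qpo1 - qpo2 - qpo3  # unpredictable remainder
```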
Coal combustion in cast-iron stoves leads to health hazards and air pollution. In this study the CO, SO2, NOx, PM and VOC emission concentrations were measured whilst combusting four fuel particle sizes (15, 20, 30, and 40 mm), as well as a composite of the sizes (all pre-devolatilized at a temperature of 550°C), in a cast-iron stove. The results were compared to their raw coal analogues to evaluate the emission performance of each fuel type. Emission factors for NOx and SO2 were found to depend on the fuel nitrogen and sulphur contents of the coal and the combustion conditions used during pyrolysis. The PM, SO2 and VOC emissions show a strong dependence on the ash percentage and volatile matter yields, which both increased with increasing particle size. In addition, the PM, SO2 and VOC emissions were found to depend on particle size only at a mechanistic level. The VOC and PM emission factors are inversely correlated with particle size. The results from this study offer insight into the combustion environment in the Falkirk Union No. 7 cast-iron stove as well as how this environment applies to low-smoke fuels. The work contributes to the emission and performance inventories for South African domestic coal combustion in this stove as used in informal settlements. Funding: the authors thank the DS&T and NRF of SA (Coal Research Chair Grant No. 86880) for funding this project.
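Emission factors of the kind discussed above are normalizations of emitted pollutant mass by fuel burned; a minimal sketch with placeholder numbers:

```python
# Minimal sketch (placeholder values, not the study's measurements):
# emission factor as pollutant mass per unit mass of fuel burned.
pollutant_mass_g = 12.0   # mass of pollutant emitted over the burn cycle
fuel_burned_kg = 2.5      # mass of fuel consumed in the same cycle

emission_factor = pollutant_mass_g / fuel_burned_kg  # g pollutant per kg fuel
print(f"emission factor: {emission_factor:.1f} g/kg")
```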
This study reports on the first investigation into the potential of luminescence dating to establish a chronological framework for the depositional sequences of the Sperchios delta plain, central Greece. A series of three borehole cores (20 m deep) and two shallow cores (4 m deep) from across the delta plain were extracted, and samples were collected for luminescence dating. The luminescence ages of sand-sized quartz grains were obtained from small aliquots of quartz, using the Single-Aliquot Regenerative-dose (SAR) protocol. The equivalent dose determination included a series of tests and the selection of the Minimum Age Model (MAM) as the most appropriate statistical model. This made it possible to confirm the applicability of quartz Optically Stimulated Luminescence (OSL) dating for establishing an absolute chronology for deltaic sediments from the Sperchios delta plain. The age results of the five cores showed that the deltaic sediments were deposited during the Holocene. A relatively rapid deposition is implied for the top 14 m, possibly as a result of the deceleration in the rate of sea-level rise and the transition to terrestrial conditions, while in the deeper parts, the reduced sedimentation rate may indicate a lagoonal or coastal environment. Funding: this research was supported by the Greek General Secretariat of Research and Technology through the ESPA-KRIPIS Project “Development of an integrated management framework of river basins and associated coastal and marine zone”.
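The ages behind such a chronology come from the standard OSL age equation; a minimal sketch with placeholder values, not the Sperchios measurements:

```python
# Minimal sketch (placeholder values): the basic OSL age equation,
# age = equivalent dose / environmental dose rate.
equivalent_dose_gy = 8.4     # Gy, from the SAR protocol on quartz aliquots
dose_rate_gy_per_ka = 2.1    # Gy per thousand years, from radionuclide data

age_ka = equivalent_dose_gy / dose_rate_gy_per_ka
print(f"burial age: {age_ka:.1f} ka")  # ~4 ka here, i.e. Holocene
```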
A sequence of ground-based radar reflectivity images, sampled in the 17 hours prior to, and during, the landfall of Severe Tropical Cyclone Larry (2006), are presented and analyzed using Fourier and wavelet analysis techniques. A range of mesoscale convective anomalies were detected, with characteristics and behavior consistent with vortex Rossby wave initiation. Cyclonically propagating eye-wall kinks, elongations and mesoscale reflectivity maxima were all observed throughout the sampling period, along with intense inner spiral bands. Various deep convective maxima propagated within the eye-wall at speeds consistent with predictions derived by linear barotropic wave theory. Three eye-wall breakdown episodes were observed during the study period, along with corresponding increases in storm-core asymmetric wave power and reductions in estimated storm intensity. Vortex Rossby wave initiated radial flows are also suggested by the presence of a possible mesovortex within a broken section of the eye-wall during landfall, and the outward ejection of filaments of deep convection from an adjacent inner spiral band. The possible influence of this wave activity upon the storm intensity and integrity is discussed.
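A minimal sketch of the azimuthal wave-power diagnostic implied above follows; the reflectivity ring is synthetic, not the study's radar pipeline.

```python
# Minimal sketch (synthetic ring, not the radar data): azimuthal Fourier
# decomposition of eye-wall reflectivity at a fixed radius, giving the
# low-wavenumber asymmetric wave power used to track vortex Rossby waves.
import numpy as np

azimuth = np.linspace(0, 2 * np.pi, 360, endpoint=False)
rng = np.random.default_rng(0)
# Mean eye-wall reflectivity plus a wavenumber-2 asymmetry, in dBZ.
reflectivity = 45 + 5 * np.cos(2 * azimuth) + rng.normal(0, 1, azimuth.size)

spectrum = np.fft.rfft(reflectivity) / azimuth.size
power = np.abs(spectrum) ** 2

for k in range(1, 5):  # asymmetric wavenumbers 1-4
    print(f"wavenumber {k}: power = {power[k]:.2f}")
```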