applications:applications [2023/02/16 05:11] (current) – gregbalco (previous revision 2022/03/07 22:46 – joetulenko)
Here are some examples we have come up with so far. Please email any of us on the project [[balcs@bgc.org|Greg Balco]], [[benjamin.laabs@ndsu.edu|Ben Laabs]], and/or [[jtulenko@bgc.org|Joe Tulenko]] with your ideas so we can add them to the list!
----
**1) Data-model comparison between LGM and penultimate moraine ages, and model output from simulations over multiple glaciations** (i.e. the ice sheet influence on regional climate example). **Hypothesis:
See the example output figure below and {{ :
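At its core, a data-model comparison like this reduces to summarizing the exposure-age population for each moraine generation and asking whether the model's deglaciation timing falls inside the observed range. The page's scripts are MATLAB; below is a minimal Python sketch of that summary step. The age values are invented for illustration and are not from ICE-D.

```python
# Illustrative only: summarize two hypothetical moraine-age populations.
import statistics

lgm_ages = [21.1, 20.4, 19.8, 21.9, 20.7]        # ka, hypothetical LGM boulder ages
penultimate_ages = [138.0, 141.5, 135.2, 144.0]  # ka, hypothetical penultimate ages

def summarize(ages):
    """Return (mean, standard deviation) of an age population in ka."""
    return statistics.mean(ages), statistics.stdev(ages)

lgm_mean, lgm_sd = summarize(lgm_ages)
pen_mean, pen_sd = summarize(penultimate_ages)

# A data-model comparison would then test whether modeled deglaciation timing
# for each glaciation falls within these observed ranges.
print(f"LGM: {lgm_mean:.1f} +/- {lgm_sd:.1f} ka")
print(f"Penultimate: {pen_mean:.1f} +/- {pen_sd:.1f} ka")
```

In practice the age arrays would be filled by an SQL query against ICE-D, as in the examples further down this page.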
----
**2) Testing global expression of Younger Dryas**
Please find a tutorial and some matlab scripts used to generate some of the plots found in a recent paper from Greg Balco ([[https://
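The basic move in this exercise is to pull calculated exposure ages out of ICE-D and test which fall inside the Younger Dryas window. The sketch below is Python rather than the page's MATLAB; the table and column names (`iced.base_sample`, `iced.base_calculatedages.t_LSDn`) are copied from the other scripts on this page, and the 12.9–11.7 ka window is the conventional YD definition.

```python
# Sketch: select LSDn ages from ICE-D and classify them against the YD window.
YD_START, YD_END = 12900, 11700  # years BP, conventional Younger Dryas bounds

# An SQL query in the same style as the MATLAB examples elsewhere on this page:
q = ("select iced.base_sample.lat_DD, iced.base_sample.lon_DD, "
     "iced.base_calculatedages.t_LSDn "
     "from iced.base_sample, iced.base_calculatedages "
     "where iced.base_sample.id = iced.base_calculatedages.sample_unique_id")

def in_younger_dryas(age_yr):
    """True if an exposure age (years BP) falls inside the YD window."""
    return YD_END <= age_yr <= YD_START

print(in_younger_dryas(12100))  # an age inside the window
```

Grouping the flagged samples by latitude band or hemisphere would then show whether the YD signal is globally expressed or regionally confined.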
----
**3) Post-Glacial Greenland ice-sheet retreat time-distance diagram** Following up on a workshop at the University at Buffalo, we attempt
{{:

The script combs through the new database and selects all ages from '

<code>
% Does deglaciation TDD for west Greenland....
clear all; close all;

%% For MAC users use this set of code to connect to the database
% First set up SSH tunnel on port 12345.
%dbc = database('

%% For WINDOWS users use this set of code to connect to the database
dbc = database('

%% get all samples

q1 = ['
'from iced.base_sample,
'where iced.base_sample.site_id = iced.base_application_sites.site_id ' ...
'and iced.base_application_sites.application_id = iced.base_application.id ' ...
'and iced.base_application.Name = "

result1 = fetch(dbc,

%% get West Greenland samples

blats = [64.8 71];
%blats = [61 71];
blons = [-60 -48];

q2 = ['
' from iced.base_sample,
' where iced.base_sample.lat_DD < ' sprintf('
' and iced.base_sample.lon_DD < ' sprintf('
' and iced.base_sample.lat_DD > ' sprintf('
' and iced.base_sample.lon_DD > ' sprintf('
' and iced.base_sample.site_id = iced.base_application_sites.site_id ' ...
' and iced.base_application_sites.application_id = iced.base_application.id ' ...
' and iced.base_application.Name = "

result2 = fetch(dbc,

locs1 = cell2mat(table2cell(result1));
locs2 = cell2mat(table2cell(result2));

%% get age data from everything

q3 = ['
'
'
'from iced.base_sample,
' where iced.base_sample.id = iced.base_calculatedages.sample_unique_id ' ...
' and iced.base_sample.site_id = iced.base_application_sites.site_id ' ...
' and iced.base_application_sites.application_id = iced.base_application.id ' ...
' and iced.base_application.Name = "
' and lat_DD < ' sprintf('

result3 = fetch(dbc,

%%

ages1 = cell2mat(table2cell(result3));

% And just from boulders
q4 = [q3 ' and iced.base_sample.what like "

result4 = fetch(dbc,

ages2 = cell2mat(table2cell(result4));

close(dbc);

%%

figure(1); clf;
% Center on Greenland Summit
gisplat = 72 + 36/60;
gisplon = -38.5;

xl = [-0.18 0.15];
yl = [-0.25 0.25];

tx = xl(1) + 0.95*diff(xl);
ty = yl(1) + 0.06*diff(yl);
ts = 14;

mw = 0.005;
awx = (1 - 5*mw)/4;
awy = (1-2*mw);

axesm('

%axesm('

g1 = geoshow('
set(get(g1,'
set(gca,'
set(gca,'
hold on;

% Plot all samples
plotm(locs1(:,
% Plot only samples in west greenland selection box
plotm(locs2(:,

plotm(blats([1 1 2 2 1]),

%

set(gca,'
%plotm(gisplat,
%text(tx,

hg = gridm('
set(hg,'
set(hg,'

temp = jet(12);
for a = 1:12
    maxage = 11000 - (a-1).*500;
    minage = maxage - 500;
    these = find(ages2(:,
    if length(these) > 0
        plotm(ages2(these,
    end
end

%% Plot

figure(2); clf;
%plot(ages1(:,
hold on;

for a = 1:
    xx = [1 1].*ages2(a,
    yy = ages2(a,3) + [-1 1].*ages2(a,
    plot(xx,
end

plot(ages2(:,
set(gca,'

%% Plot proposed cold periods from Young and others, etc.

hold on;

events = [11620 10410 9090 8050 7300];
devents = [430 350 260 220 310];

for a = 1:
    xx = [-54 -49]; yy = [events(a) events(a)];
    plot(xx,
    xx = [-54 -49 -49 -54 -54];
    yy = [events(a)-devents(a) events(a)-devents(a) events(a)+devents(a) events(a)+devents(a) events(a)-devents(a)];
    patch(xx,
end

%% filter for better AEP performance
p1 = polyfit([-53.5 -49.88],
px = [-54 -49]; py = polyval(p1,

predt = polyval(p1,
okclip = find(abs(predt - ages2(:,3)) < 1000);
plot(ages2(okclip,

aept = ages2(okclip,
aepdt = ages2(okclip,
aepz = -ages2(okclip,

% Get bounding curve

out = aep_bound(aept,
figure(2);
aept_out = out.t; aepx_out = -(out.z + 4900)./
plot(aepx_out,

%% MCS bounding curve

% Interestingly,
% there are a lot of data, almost all the random iterations have one
% outlier that pulls it out. So it actually doesn'

if 1
    figure;
    plot(aept,
    p1 = plot(1000,
    ni = 100;
    intt = 6500:
    for a = 1:ni
        thist = randn(size(aept)).*aepdt + aept;
        thisz = aepz;
        out = aep_bound(thist,
        delete(p1);
        p1 = plot(thist,
        plot(out.t,
        %inty(a,:) = interp1(out.t,
        %plot(intt,
        disp(a);
    end
end

%%

if 1
    % Make figure comparing this to Holocene "

    figure;
    diffx = diff(aepx_out)./
    for a = 1:
        xx = [aept_out(a) aept_out(a+1) aept_out(a+1)];
        if a == length(diffx)
            yy = -[diffx(a) diffx(a) diffx(a)];
        else
            yy = -[diffx(a) diffx(a) diffx(a+1)];
        end
        plot(xx,
    end

    set(gca,'
    grid on;
    for a = 1:
        yy = [0 5e-3]; xx = [events(a) events(a)];
        plot(xx,
        yy = [0 5e-3 5e-3 0 0];
        xx = [events(a)-devents(a) events(a)-devents(a) events(a)+devents(a) events(a)+devents(a) events(a)-devents(a)];
        patch(xx,
    end
end

</code>
----
**4) Determining if measurement precision has gotten better through time**
This is a somewhat simple and fun exercise to investigate whether we as a community have been making progressively better measurements (i.e. improvements to field sampling techniques, lab extraction procedures, AMS measurements,
See the summary plots below, which show the story is a bit more complicated;
**Plot 1**
This is what you get when you plot measurement errors (%) for all samples in the database (with sample collection dates) against their sample collection dates. A linear trend fitted to the data has a very poor R^2 value.
{{ :
**Plot 2**
As the plot shows, there is a notable power-law relationship between sample concentration and percent error. This makes sense: the higher the concentration of Be-10 in a sample, the smaller the fraction of the measured signal that comes from background Be-10 (i.e. counts that did not originate within the sample's quartz grains, but rather come from meteoric Be-10 contamination, boron-10 interference at the same atomic mass as Be-10, and/or Be-10 introduced with the Be-9 carrier).

{{ :
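The power-law fit behind Plot 2 can be recovered by ordinary linear regression in log-log space: if err% = a · conc^b, then log(err%) = b·log(conc) + log(a). Here is a minimal Python sketch of that step (the page's own script does this in MATLAB, arriving at 65.185 · conc^-0.216); the data points below are synthetic, generated from a known power law purely to demonstrate the method.

```python
# Recover the exponent of a power law err% = a * conc^b by least squares
# in log-log space. Synthetic, noise-free data for demonstration.
import math

a_true, b_true = 65.0, -0.2
conc = [1e4, 1e5, 1e6, 1e7]                       # Be-10 atoms/g, synthetic
err_pct = [a_true * c ** b_true for c in conc]    # generated from the known law

# Linear regression of log(err) on log(conc): slope = b, intercept = log(a).
xs = [math.log(c) for c in conc]
ys = [math.log(e) for e in err_pct]
n = len(xs)
xbar, ybar = sum(xs) / n, sum(ys) / n
b = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) \
    / sum((x - xbar) ** 2 for x in xs)
a = math.exp(ybar - b * xbar)
```

With real, noisy ICE-D concentrations the recovered a and b would of course carry uncertainty, which is why the R^2 of the fit matters.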

**Plot 3**
So what we need to do is detrend

The final plot below shows what happens when you plot detrended measurement errors against sample collection date. Notice that the R^2 value is still relatively low, but it did increase by a factor of 4 and the slope became slightly more negative. Maybe this very loose correlation does say something about community improvement... What do you think?

{{ :
<code>
%%%% A matlab script for extracting all 10Be measurements from the alpine/
%%%% and plotting their % measurement uncertainty against collection date
%%%% to see if we collectively as a community have gotten any better over the years
%%%% at making these measurements. Largest improvements I would expect
%%%% should come from laboratory procedures (ie lower level blanks, better
%%%% 10Be isolation techniques, AMS precision improvement (maybe), etc.

% clear out all pre-existing variables and windows to start fresh
clear all; close all;

% as always, one must first connect to ICE-D.
% Before running the script, be sure that you are connected to ICE-D in
% HeidiSQL!
% The script for connecting from a Windows OS is as follows,
% assuming you set up an ODBC connection:

dbc = database('

% here is some code from Greg Balco for connecting to the database from a
% Mac computer. I have no idea if it actually works but here it is for
% y'

% Get Sequel Pro running and use
% the below to find the SSH tunnel port.

%[sys1,
%portindex = strfind(sys2,':
%portstr = sys2((portindex(1)-5):

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

% Once connected to ICE-D, we want to extract ALL samples from the database
% that have sample collection date information found in the '
% here is an SQL query formatted in matlab to extract all samples and
% format the sample names, date collected, concentration of 10Be atoms and
% AMS measurement error into one table:

%% Query to extract all 10Be data from the alpine database

q1 = ['
' from iced.base_sample,
' where iced.base_sample.id = iced._be10_al26_quartz.sample_id ' ...
' and iced.base_sample.id = iced.base_calculatedages.sample_unique_id ' ...
' and iced.base_sample.site_id = iced.base_site.id ' ...
' and iced.base_site.id = iced.base_application_sites.site_id ' ...
' and iced.base_sample.date_collected is not null ' ...
' and iced._be10_al26_quartz.N10_atoms_g is not null ' ...
' and iced._be10_al26_quartz.delN10_atoms_g is not null ' ...
' and iced.base_sample.date_collected not like "
' and iced.base_sample.date_collected not like "
' and iced.base_application_sites.application_id = 2 '];

%% If you would like to try the entire database (not just the alpine database) use this query instead of the first one

%q1 = ['
% ' from iced.base_sample,
% ' where iced.base_sample.id = iced._be10_al26_quartz.sample_id ' ...
% ' and iced.base_sample.id = iced.base_calculatedages.sample_unique_id ' ...
% ' and iced.base_sample.date_collected is not null ' ...
% ' and iced._be10_al26_quartz.N10_atoms_g is not null ' ...
% ' and iced._be10_al26_quartz.delN10_atoms_g is not null ' ...
% ' and iced.base_calculatedages.t_LSDn not like "
% ' and iced.base_sample.date_collected not like "
% ' and iced.base_sample.date_collected not like "

%% gather the data and organize it into cell arrays that can be used for plotting

% now, we want to store these selections as a table in matlab
% and convert the table into column vectors for plotting.
% this is made somewhat complicated by the fact that the dates extracted
% from ICE-D are in string format so you have to break them up.

samples.table = fetch(dbc,
samples.date_cell = table2array(samples.table(:,
samples.date_cell_split = split(samples.date_cell,
samples.date_raw = str2double(samples.date_cell_split);
samples.date_final = double(samples.date_raw(:,
samples.conc = table2array(samples.table(:,
samples.err = table2array(samples.table(:,
samples.errpercent = double(((samples.err(:,
samples.age = table2array(samples.table(:,

% most important variables we created here are samples.errpercent
% (sample error divided by sample concentration * 100), the final date
% collected (year + month/12 + day/30) and the age of each sample, to make
% a scatterplot of percent error versus time (with marker size dictated by age)

%% Figure 1 - percent error by date measured without detrending the influence of sample concentration

fig1 = figure(1);
scatter(samples.date_final,
'
'
ylim([0 100])
set(gca, '
ylabel('
xlabel('
dim1 = [.15 .9 0 0];
str1 = {['
['R^2 = 0.0003'
annotation('
hold on
x1 = samples.date_final;
y1 = -0.0669 * x1 + 140.15;
plot(x1,y1, "

% and that is the simple approach, but as we know, there is a dependence
% of precision on sample concentration (ie higher concentration = generally
% lower error, since the background Be is less impactful for hot samples).
% to get around this, let's weight each sample based on this relationship
% (see the script below and the resulting figure that demonstrates this
% what-appears-to-be a power relationship).

%% Figure 2 establishing the relationship between sample concentration and percent error

fig2 = figure(2);
scatter(samples.conc,
'
'
ylim([0 100])
set(gca, '
ylabel('
xlabel('
dim2 = [.15 .9 0 0];
str2 = {['
['R^2 = 0.1005'
annotation('
hold on
x2 = samples.conc;
y2 = 65.185 * x2.^-0.216;
plot(x2,y2, "

% so now, let's plot percent errors weighted by the equation fit to
% the precision vs concentration plot above. Simplest way to do this is to
% detrend the data (ie generate a curve of expected values given the
% distribution - the fitted power curve - and subtract the expected values
% from the actual measurements). Then, plot the difference against the
% measurement date to remove the influence of sample concentration on error
% percent, since we only want to know if error percent is decreasing over
% time.

% the way I have been thinking of it is like this: if we made the same
% measurement on the same exact sample every time (ie the expected
% concentration should be the same every time) for the past 30 years, and if
% we actually have been making better measurements,
% decreasing trend in percent error over time (even though it is the exact
% same sample that we are measuring every time). Obviously this is not true,
% we've been making all sorts of measurements over the past 30 years, and so
% this is an attempt to detrend the data in concentration vs percent error
% space and plot the detrended data against date of measurement. Anyway...

%% Figure 3 plotting percent error by date measured after accounting for influence of concentration on percent error

fig3 = figure(3);
y3 = samples.errpercent - y2;
scatter(samples.date_final,
'
'
ylim([-10 10])
ylabel('
xlabel('
dim3 = [0.15 .9 0 0];
str3 = {['
['R^2 = 0.0012'
annotation('
hold on
x3 = samples.date_final;
y3 = -0.1289 * x3 + 260.3;
plot(x3,y3, "
</code>
----
**5) Is there a correlation between Al/Be ratios and sample elevation?
This example is based on a recent publication ([[https://
% assuming you set up an ODBC connection:

dbc = database('

% here is some code from Greg Balco for connecting to the database from a

% dictating marker size).

%% Query to extract all 10Be Al26 pairs from the alpine database

q1 = ['
' iced.base_sample.elv_m,
' iced._be10_al26_quartz.N10_atoms_g,
' iced._be10_al26_quartz.delN10_atoms_g,
' iced._be10_al26_quartz.N26_atoms_g,
' iced._be10_al26_quartz.delN26_atoms_g,
' iced.base_calculatedages.t_LSDn ' ...
' from iced.base_sample,
' where iced.base_sample.id = iced._be10_al26_quartz.sample_id ' ...
' and iced.base_sample.id = iced.base_calculatedages.sample_unique_id ' ...
' and iced.base_sample.site_id = iced.base_site.id ' ...
' and iced.base_site.id = iced.base_application_sites.site_id ' ...
' and iced.base_sample.elv_m is not NULL ' ...
' and iced._be10_al26_quartz.N10_atoms_g is not NULL ' ...
' and iced._be10_al26_quartz.N26_atoms_g is not null ' ...
' and iced._be10_al26_quartz.N26_atoms_g > 1000 ' ...
' and iced._be10_al26_quartz.N26_atoms_g not like 0 ' ...
' and iced.base_application_sites.application_id = 2 '];

%% If you would like to try the entire database (not just the alpine database) use this query instead of the first one
% Worth noting that in this query, we have not isolated the samples that
% have strictly simple exposure histories, so there are probably
% samples in the dataset

%q1 = ['
%' iced.base_sample.elv_m,
%' iced._be10_al26_quartz.N10_atoms_g,
%' iced._be10_al26_quartz.delN10_atoms_g,
%' iced._be10_al26_quartz.N26_atoms_g,
%' iced._be10_al26_quartz.delN26_atoms_g,
%' iced.base_calculatedages.t_LSDn ' ...
%' from iced.base_sample,
%' where iced.base_sample.id = iced._be10_al26_quartz.sample_id ' ...
%' and iced.base_sample.id = iced.base_calculatedages.sample_unique_id ' ...
%' and iced.base_sample.elv_m is not NULL ' ...
%' and iced._be10_al26_quartz.N10_atoms_g is not NULL ' ...
%' and iced._be10_al26_quartz.N26_atoms_g is not null ' ...
%' and iced._be10_al26_quartz.N26_atoms_g > 1000 ' ...
%' and iced._be10_al26_quartz.N26_atoms_g not like 0 ' ...
%' and iced.base_calculatedages.t_LSDn not like 0 ' ...
%' and iced.base_sample.name not like "

%% gather the data and organize it into cell arrays

samples.table = fetch(dbc,

samples.error = double((samples.cell_array(:,

%% The code to make a figure plotting up all of the samples by elevation vs Al/Be ratio
fig1 = figure(1);
errorbar(samples.cell_array(:,
hold on

'
'
ylim([0 15])
xlabel('
ylabel('
dim1 = [.8 .9 0 0];
str1 = {['
['R^2 = 0.0375'
annotation('
hold on
x1 = samples.cell_array(:
y1 = -.0002 * x1 + 6.4274;
plot(x1,y1, "

% thanks for following along!
----
**6) Heinrich Stadial aridity drives glacier retreat in the Mediterranean?
This example is a follow-up to a paper recently published in Nature Geoscience ([[https://
----
**7) Identifying regions of possible heavy moraine degradation** (using the moraine ages and land degradation models incorporated into the middle layer of calculations) and **comparing identified areas of high degradation to geohazards** (plate boundaries and areas of high seismic activity).
**Hypothesis: