WEBVTT 00:00:03.000 --> 00:00:13.000 Greetings! Come, everybody, join virtual hands as we join a community of community modelers, led by Jess and Andreas. Please take it away, 00:00:13.000 --> 00:00:18.000 Jess and Andreas. 00:00:18.000 --> 00:00:25.000 Okay. Well, welcome back everybody to our next session, entitled "Modeling is better with friends: 00:00:25.000 --> 00:00:46.000 Update on California community models." We'll have a series of talks about a variety of fairly large-scale studies and community models in Central and Northern California, including discussion of fault creep, earthquake monitoring, and seismic velocity models. So I 00:00:46.000 --> 00:00:53.000 just wanted to reiterate, for those who are just tuning in or may have forgotten, that 00:00:53.000 --> 00:00:54.000 we'll have our series of five talks, and then we'll have a 30-minute discussion at the end. 00:00:54.000 --> 00:01:05.000 We'd like people to use the chat during the talks primarily for questions related to clarification that the speaker can answer while the talk is playing. 00:01:05.000 --> 00:01:14.000 But please plan to save your more in-depth questions for the Q & A 00:01:14.000 --> 00:01:15.000 session, so that the chat doesn't distract people from actually listening to the talks. 00:01:15.000 --> 00:01:25.000 So with that I'll hand it over to Andreas, who will be introducing the individual talks. 00:01:25.000 --> 00:01:29.000 Yes, thanks. We'll take care of 00:01:29.000 --> 00:01:36.000 any questions in the talks, and the chat will also be mentioned in the discussion session. 00:01:36.000 --> 00:01:46.000 So the first talk here is by Josie Nevitt and Austin Elliott from the USGS 00:01:46.000 --> 00:01:50.000 on "Shallow fault creep within the Bay Area fault system 00:01:50.000 --> 00:01:55.000 from near-field observations." 00:01:55.000 --> 00:01:58.000 Many faults in northern California creep at the surface. 00:01:58.000 --> 00:02:06.000 This map shows the spatial variation in the long-term creep rates measured by creepmeters, as triangles, and alignment arrays, as circles. 00:02:06.000 --> 00:02:10.000 Here are two conceptual models for what this creep might represent. 00:02:10.000 --> 00:02:18.000 On the left, you can imagine that earthquakes at depth are driving creep towards Earth's surface, whereas on the right the surface creep is arising due to the resolved 00:02:18.000 --> 00:02:22.000 tectonic shear stresses, and may in turn load the seismogenic zone at depth. 00:02:22.000 --> 00:02:30.000 So understanding the mechanics of shallow creep really is important for a fundamental knowledge of how faults work and the hazards they pose. 00:02:30.000 --> 00:02:35.000 The USGS and partners have measured surface creep using a variety of tools over the last five decades. Today, 00:02:35.000 --> 00:02:45.000 Austin Elliott and I will focus on the alignment array and creepmeter networks, describing their history, instrumentation, and the data sets that are publicly available. 00:02:45.000 --> 00:02:46.000 The longest-running, continual measurements of surface creep on Northern California 00:02:46.000 --> 00:02:55.000 faults come from the reoccupation and surveying of alignment arrays. With the retirement of Jim Lienkaemper, 00:02:55.000 --> 00:02:57.000 I have inherited stewardship of the alignment array data sets, and so we'll give an overview of them here, 00:02:57.000 --> 00:03:06.000 noting that all credit for their installation, routine measurement, and use goes to folks that came before me.
00:03:06.000 --> 00:03:10.000 Alignment arrays, alternately spelled with a GN or an 00:03:10.000 --> 00:03:17.000 NE, are networks of survey monuments installed on the ground surface straddling a fault trace over about 100 meters. 00:03:17.000 --> 00:03:26.000 These are reoccupied at some time interval to measure the relative changes in position among the established monuments due to fault creep. 00:03:26.000 --> 00:03:41.000 Monuments generally consist of metal caps or nails with a precise dot, cross, or hatches to enable unambiguous re-setup of tripod-mounted survey equipment with sub-millimeter accuracy over a known point. Good-quality monuments are 00:03:41.000 --> 00:03:45.000 embedded in concrete and installed within a protective vault. 00:03:45.000 --> 00:03:46.000 For various logistical reasons, which I'll mention shortly, 00:03:46.000 --> 00:03:54.000 monuments must sometimes just be survey nails pounded through existing pavement surfaces. 00:03:54.000 --> 00:04:06.000 Some of the early alignment arrays in the 1960s and 1970s were installed as deflection arrays, which are a line of about half a dozen or more markers perpendicular to the fault trace from which deflection by tectonic motion could be 00:04:06.000 --> 00:04:23.000 measured. In contrast, for precision of measurement, ease of calculation, and efficiency of surveying, most modern arrays consist of just 3 points: 2 straddling the fault, and a third that sits off-axis on the same side of the fault as the survey 00:04:23.000 --> 00:04:24.000 instrument, presumed to be undergoing the same fault- 00:04:24.000 --> 00:04:35.000 parallel motion as the instrument station. Motion of one side of the fault relative to the other is calculated by measuring the change in the angle formed by these two survey lines; a small numerical sketch of this conversion follows below. 00:04:35.000 --> 00:04:47.000 For example, in the configuration illustrated here, right-lateral motion along the fault will result in an increasing angle between the end station and the off-axis station, as measured from the instrument station. 00:04:47.000 --> 00:04:50.000 The arrays generally span 100 meters across a recognized fault 00:04:50.000 --> 00:04:56.000 trace, and effort has been made to install them at sites where creep is localized on a single clear strand. 00:04:56.000 --> 00:05:04.000 Most arrays are situated within roadways, which benefit from stable paved surfaces, clear sight lines, and orthogonal crossings, 00:05:04.000 --> 00:05:06.000 in addition to more clearly and unambiguously expressing a creeping fault 00:05:06.000 --> 00:05:22.000 trace. Oftentimes city monuments can be used, but logistical problems do arise from this setup, including the periodic removal of monuments by repaving, the introduction of obstacles like moving and parked cars, and vegetation that can sort 00:05:22.000 --> 00:05:39.000 of overgrow and interrupt the sight line. The availability and suitability of sites is not trivial, and so decades of careful work and consideration have gone into the installation and maintenance of this survey network. There are currently 90 alignment arrays established around the broader 00:05:39.000 --> 00:05:53.000 Bay Area region, a term which I'm using a little generously here because they extend all the way up nearly to the Triple Junction on the Bartlett Springs fault.
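To make the three-point geometry above concrete, here is a minimal Python sketch of converting a measured angle change into fault-parallel displacement using the small-angle approximation. The function name, the 10-arcsecond angle change, and the 100 m sight length are illustrative assumptions, not values from the survey network.

```python
import math

def creep_displacement(delta_angle_arcsec, sight_length_m, line_fault_angle_deg=90.0):
    """Convert a change in the angle between the two survey lines into
    fault-parallel displacement (small-angle approximation)."""
    dtheta = math.radians(delta_angle_arcsec / 3600.0)  # arcseconds -> radians
    transverse = sight_length_m * dtheta                # motion across the sight line
    # Project onto the fault-parallel direction if the line crosses obliquely.
    return transverse / math.sin(math.radians(line_fault_angle_deg))

# Hypothetical example: a 10-arcsecond angle change over a 100 m sight line.
print(f"{creep_displacement(10.0, 100.0) * 1000:.2f} mm")  # ~4.85 mm
```

For a 100 m sight line, a 10-arcsecond angle change corresponds to roughly 5 mm of slip, which is why the sub-millimeter monument accuracy described above matters.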
The Hayward fault is the most densely monitored, with a monument array established roughly every 2 00:05:53.000 --> 00:05:56.000 kilometers along its full length through the East Bay. Pretty much all of our faults, except for the San Andreas 00:05:56.000 --> 00:06:09.000 at this latitude, exhibit some level of aseismic or interseismic creep, and so the network includes multiple arrays that have been established across all the major faults through impressive expansion by my predecessor here at the Survey, 00:06:09.000 --> 00:06:15.000 Jim Lienkaemper, as well as Forrest McFarland and John Caskey, who head up the creep program at San Francisco 00:06:15.000 --> 00:06:30.000 State University. These three maps illustrate the longevity and results from the alignment array sites. The earliest were installed in the late 1960s, after creep was formally recognized along the Hayward fault, representing in some places over half a century of 00:06:30.000 --> 00:06:36.000 data. One can see the 21st-century expansion to cover the broader region. 00:06:36.000 --> 00:06:39.000 The middle map shows the latest year during which each site was reoccupied, a proxy for its status. 00:06:39.000 --> 00:06:49.000 Some, which you can see in orange, have been temporarily skipped due to the personnel and resource limitations from the COVID pandemic, 00:06:49.000 --> 00:06:50.000 while a few others, shown in red, have been abandoned permanently due to destruction, 00:06:50.000 --> 00:07:01.000 the loss of access, irregularities in measurements caused by soil and mass-wasting processes, or an absence of active creep 00:07:01.000 --> 00:07:08.000 after many measurement [indiscernible]. On the right is a map of the average creep rate recorded at each site over its life span. 00:07:08.000 --> 00:07:11.000 You can see that the highest rates are on the southern Hayward and Calaveras 00:07:11.000 --> 00:07:24.000 faults. The data from the northern California alignment arrays represent the longest-running measurement of fault creep on the planet, a legacy which makes them valuable as long-term markers of steady deformation, as well as ground-truth, millimeter- 00:07:24.000 --> 00:07:28.000 precision records of annual creep. On the Hayward and Calaveras faults 00:07:28.000 --> 00:07:35.000 these records go back over five decades, as you can see from these time series from all sites on the Hayward fault. 00:07:35.000 --> 00:07:49.000 Alignment array measurements capture annual- to decadal-scale changes in creep rate, including local and regional responses to significant earthquakes, as well as localized discrete creep events like the one on the southern Hayward in 1996. These plots rely 00:07:49.000 --> 00:07:53.000 on interpolation from calculated velocities, as sometimes the accumulated displacement is not recoverable 00:07:53.000 --> 00:08:02.000 year over year, when, for example, a monument gets removed or paved over and the site has to be reinstalled.
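As a rough sketch of the bookkeeping just described, the snippet below rebuilds a cumulative-displacement curve by interpolating the creep rate across an interval where the offset was not recoverable. All epochs and rates are made-up illustrative numbers, not data from any array.

```python
import numpy as np

# Hypothetical survey epochs (decimal years) and per-interval creep rates (mm/yr);
# NaN marks an interval where the accumulated offset was unrecoverable
# (e.g., a monument was paved over and the site was reinstalled).
epochs = np.array([1995.0, 1996.1, 1997.0, 1999.2, 2000.1])
rates  = np.array([4.2,    4.0,    np.nan, 4.5])  # mm/yr for each interval

# Fill the gap by interpolating the rate from neighboring intervals,
# then integrate rate * dt to rebuild a continuous cumulative record.
mid = 0.5 * (epochs[:-1] + epochs[1:])            # interval midpoints
good = ~np.isnan(rates)
filled = np.interp(mid, mid[good], rates[good])   # interpolated velocities
cumulative = np.concatenate([[0.0], np.cumsum(filled * np.diff(epochs))])
print(dict(zip(epochs, np.round(cumulative, 2))))
```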
00:08:02.000 --> 00:08:06.000 This plot further illustrates the variability of creep along the Hayward fault, 00:08:06.000 --> 00:08:21.000 in both the spatial (on the x-axis) and temporal (on the y-axis) dimensions, as measured from these alignment arrays. Phenomena that stand out in this unique data set include the slowdown at the southern end of the Hayward after the 1989 00:08:21.000 --> 00:08:37.000 earthquake, the low rates observed around the presumed rupture area of the 1868 Hayward fault earthquake, and changes in creep rate, both positive and negative, associated with moderate, magnitude-4-size earthquakes in the East Bay. I 00:08:37.000 --> 00:08:44.000 believe there is a lot more to discover digging into these data more finely. 00:08:44.000 --> 00:08:47.000 A further major role of alignment arrays is in earthquake response. As most of the faults in our region exhibit some level of aseismic slip 00:08:47.000 --> 00:08:59.000 interseismically, it is expected that afterslip will form a large component of near-field deformation in future 00:08:59.000 --> 00:09:03.000 major ruptures. After the 2014 South Napa earthquake, 00:09:03.000 --> 00:09:04.000 Jim Lienkaemper and the earthquake geology crew 00:09:04.000 --> 00:09:22.000 illustrated the value that quickly established and reoccupied monument arrays can play in capturing and forecasting continued afterslip. Deployment of new arrays is as easy as hammering survey nails into the ground, and the regional network of existing arrays provides a 00:09:22.000 --> 00:09:29.000 well-measured before-and-after baseline for these sorts of 100-meter-aperture, ground-truth measurements of offset. We expect to make use of alignment 00:09:29.000 --> 00:09:39.000 array monuments in conjunction with complementary methods, such as repeat lidar and space-borne imaging like InSAR. 00:09:39.000 --> 00:09:43.000 New annual measurements are now published in a consistent USGS 00:09:43.000 --> 00:09:46.000 ScienceBase data repository each spring. These contain CSV files of each site's full measurement 00:09:46.000 --> 00:09:56.000 history plus regional summary tables of new results. I am now the steward of these data, 00:09:56.000 --> 00:09:59.000 so reach out to me if you have any questions. Forrest McFarland and John Caskey at San Francisco State University 00:09:59.000 --> 00:10:08.000 are also major points of contact for these data sets. 00:10:08.000 --> 00:10:10.000 Now we'll transition to the creepmeters. 00:10:10.000 --> 00:10:25.000 A creepmeter measures line-length changes along a shallowly buried rod or wire at an oblique angle to the fault. The fault-parallel displacement is the measured length change divided by the cosine of the obliquity. The monuments are typically anchored to at least 3 00:10:25.000 --> 00:10:40.000 meters depth, but this varies by installation. Because the cross-fault aperture of creepmeters is narrow, generally less than about 15 meters, creepmeters are most sensitive to slip in the upper 5 meters and are generally insensitive to slip at depths greater 00:10:40.000 --> 00:10:50.000 than 20 meters. The nominal resolution of the measurements is on the order of 10 microns, and the sampling interval is generally 1 or 10 minutes for most locations. 00:10:50.000 --> 00:10:59.000 The data are transmitted to Menlo Park once every 10 minutes, where they're processed and uploaded to the deformation website.
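A minimal sketch of the obliquity correction described above; the 30-degree obliquity and 2 mm length change are hypothetical values, not from any particular instrument.

```python
import math

def fault_parallel_slip(length_change_mm: float, obliquity_deg: float) -> float:
    """Convert a creepmeter line-length change into fault-parallel slip.

    The instrument measures extension or shortening along a rod or wire
    crossing the fault at an oblique angle; the fault-parallel displacement
    is the measured change divided by the cosine of that obliquity."""
    return length_change_mm / math.cos(math.radians(obliquity_deg))

# A 2.0 mm line-length change on a wire crossing the fault at 30 degrees:
print(f"{fault_parallel_slip(2.0, 30.0):.2f} mm")  # ~2.31 mm
```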
00:10:59.000 --> 00:11:06.000 This map was made using data from an Open-File Report by John Langbein, Roger Bilham, Andy Snyder, and Todd Erickson 00:11:06.000 --> 00:11:12.000 that will be released soon. This is a major update to the 1989 Open-File Report by Sandra Schultz. Creepmeters 00:11:12.000 --> 00:11:20.000 were first installed in the mid-1960s, primarily along the San Andreas fault. In the 1980s, in preparation for the next earthquake, 00:11:20.000 --> 00:11:24.000 the number of creepmeters in the Parkfield region increased to 12. 00:11:24.000 --> 00:11:38.000 The 1990s saw the installation of 5 creepmeters along the Hayward fault, and over the last 30 years or so the number of active creepmeters has gradually declined, primarily along the San Andreas fault. This map shows in yellow the locations 00:11:38.000 --> 00:11:49.000 of active creepmeters today, not including recent installations by Heather Shaddox et al. in the San Juan Bautista region. 00:11:49.000 --> 00:11:52.000 Here we're looking at the long-term creep rates determined from least-squares 00:11:52.000 --> 00:11:56.000 regression of the creepmeter time series. For the Hayward fault 00:11:56.000 --> 00:12:00.000 the creep rate ranges from about 3 to 7 mm/year. 00:12:00.000 --> 00:12:03.000 On the Calaveras fault it ranges from about 2 to 10 mm/year, and then along the San Andreas 00:12:03.000 --> 00:12:14.000 the rates increase from San Juan Bautista towards the southeast, from 9 to 15 mm/year, reaching about 20 mm/year 00:12:14.000 --> 00:12:19.000 in the central part of the creeping section, and then decreasing again to 0 00:12:19.000 --> 00:12:23.000 south of Parkfield, where the fault is presumably locked at the surface. 00:12:23.000 --> 00:12:31.000 In addition to the variation in the secular creep, we see that there also is variation in how this creep accumulates over time. 00:12:31.000 --> 00:12:48.000 For example, these two sites along the Hayward fault have similar long-term creep rates and seasonal variations, but the Point Pinole time series in blue varies much more smoothly compared to Fremont, which has a more jagged or punctuated appearance. 00:12:48.000 --> 00:12:55.000 Here's an example where you can see an abrupt increase in the creep rate, which is well above the long-term average. 00:12:55.000 --> 00:13:11.000 These abrupt jumps are referred to as creep events, and we really don't have a good mechanical or rheological understanding of what leads to the smooth or continuous behavior versus this episodic behavior. 00:13:11.000 --> 00:13:25.000 So John Langbein went through the full data sets at each site and calculated what proportion of the cumulative creep actually accrued as discrete creep events; in this analysis, creep events were defined where the increment of slip in 1 day 00:13:25.000 --> 00:13:28.000 exceeded the long-term average by 10 times. 00:13:28.000 --> 00:13:36.000 What he found is that the Hayward fault is characterized largely by the smooth or continuously creeping behavior, 00:13:36.000 --> 00:13:45.000 aside from Fremont; San Juan Bautista is much more episodic, and Parkfield is a combination of both behaviors. 00:13:45.000 --> 00:13:53.000 Here is all the northern California creepmeter data on one slide, with the linear long-term trend and seasonal periodicities removed. 00:13:53.000 --> 00:13:57.000 We see creep accumulating over a variety of time scales.
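Here is a minimal sketch of the creep-event bookkeeping just described, flagging days whose slip increment exceeds 10 times the long-term average daily rate and tallying the fraction of creep they carry. The synthetic record and the helper name are illustrative; this is not John Langbein's actual code.

```python
import numpy as np

def flag_creep_events(t_days, slip_mm, factor=10.0):
    """Flag samples whose daily slip increment exceeds `factor` times the
    long-term average rate (the definition used in the analysis above).

    t_days  : sample times in days (roughly daily sampling assumed)
    slip_mm : cumulative creep at those times"""
    t, d = np.asarray(t_days, float), np.asarray(slip_mm, float)
    long_term = (d[-1] - d[0]) / (t[-1] - t[0])   # mean mm/day over the record
    daily = np.diff(d) / np.diff(t)               # incremental daily rate
    events = np.flatnonzero(daily > factor * long_term)
    fraction = (d[events + 1] - d[events]).sum() / (d[-1] - d[0])
    return events, fraction  # event indices, share of creep accrued in events

# Synthetic record: steady ~5 mm/yr creep plus one 2 mm event at day 200.
t = np.arange(365.0)
slip = 5.0 / 365.0 * t
slip[200:] += 2.0
idx, frac = flag_creep_events(t, slip)
print(idx, f"{frac:.0%} of creep accrued in events")
```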
00:13:57.000 --> 00:14:07.000 So again, the jagged appearance is due to the individual creep events, but we also see rate changes on the order of years, or even decades, and these often accompany regional earthquakes. 00:14:07.000 --> 00:14:18.000 For instance, along the San Juan Bautista section of the San Andreas fault we see these three northern creepmeters have a reduced rate leading up to the Loma Prieta earthquake 00:14:18.000 --> 00:14:28.000 and an increased rate following it. Meanwhile the creepmeters along the Calaveras fault recorded reduced creep rates following the Loma Prieta earthquake. 00:14:28.000 --> 00:14:33.000 Of course, the Parkfield creepmeters recorded the post-seismic afterslip response from the 2004 magnitude 6 earthquake there. 00:14:33.000 --> 00:14:42.000 However, these rate changes can't always be attributed to local or regional earthquakes. 00:14:42.000 --> 00:14:48.000 For instance, along the Hayward fault, beginning in about 2008, there are increased rates along CTM 00:14:48.000 --> 00:14:59.000 in green and COZ in blue, and a decreased rate along CHP in pink, and I don't think that we have an explanation for this change in behavior. 00:14:59.000 --> 00:15:10.000 An additional challenge to unraveling all of these records is understanding how shallow, normally unsaturated fault zone materials interact with water. 00:15:10.000 --> 00:15:24.000 We're studying this in Fremont, where a borehole crosses the dipping Hayward fault at about 21 meters depth. Using an inclinometer system, we're able to make in situ measurements of the borehole deformation, including at depths below the 00:15:24.000 --> 00:15:29.000 water table, and compare with the creepmeter records from the surface. 00:15:29.000 --> 00:15:32.000 So this red curve shows the borehole fault- 00:15:32.000 --> 00:15:36.000 parallel displacement versus depth between 2003 and 2021, 00:15:36.000 --> 00:15:49.000 where you can see there's been 12 cm of offset across the fault. 00:15:50.000 --> 00:15:56.000 We also see that the shallowest portion of the borehole tilts in the opposite direction, the retrograde direction. The green curves [noise] show the results of an inverse model that's solving for the slip distribution on the fault using the borehole displacements, and we find that in order to 00:15:56.000 --> 00:16:14.000 match this retrograde tilt there needs to be a 30% reduction in the shallowest slip, and this reduction is largely occurring above the water table, which we directly observe in the borehole, and this is consistent with previous laboratory findings that fault zone materials can be up to 00:16:14.000 --> 00:16:19.000 twice as strong when they're dry compared to when they're wet. 00:16:19.000 --> 00:16:35.000 Now this plot shows the fault-parallel displacement versus time during a 3 mm creep event, and it appears that peak displacement occurs at depth prior to reaching the surface, so the inclinometer measurements peak and then plateau at day 00:16:35.000 --> 00:16:36.000 285, while the creepmeter measurements continue to increase, and then even accelerate 00:16:36.000 --> 00:16:50.000 during this big rain event. So again, this is consistent with the idea of inhibited slip in the shallowest unsaturated layer that's released when the fault zone is lubricated. I'll wrap up by highlighting some recent community advances, 00:16:50.000 --> 00:16:56.000 including new sensor designs from Roger Bilham.
00:16:56.000 --> 00:17:13.000 This includes the installation of gas sensors at several creepmeters, along with orthogonal paired sensors, which Heather Shaddox talked about, and the design of multi-node sensors that would allow us to infer the shallow slip distribution. Here are three publications 00:17:13.000 --> 00:17:29.000 that came out recently using the USGS data. Hirao et al. looked at dynamic triggering and communication between the northern and southern portions of the creeping section of the San Andreas. Gittens and Hawthorne used template matching to 00:17:29.000 --> 00:17:34.000 identify creep events and estimate their along-strike size. 00:17:34.000 --> 00:17:55.000 Jiang et al. looked at how afterslip and aftershocks evolved following the Parkfield earthquake. If you're interested in viewing or downloading the data, it's available at this website, which I will try to remember to put into the chat right now. So thank you. 00:17:55.000 --> 00:18:03.000 Thanks very much. Nice presentation! Let's move on to our next 00:18:03.000 --> 00:18:18.000 talk by Roland Burgmann on "Geodetic constraints on spatio-temporal variations of crustal deformation and slow 00:18:18.000 --> 00:18:24.000 fault slip in northern California." 00:18:24.000 --> 00:18:32.000 Okay. So radar interferometry, space-based observation of surface deformation 00:18:32.000 --> 00:18:51.000 using InSAR, has been around now for actually 30 years, and we've used this technology in a number of different satellite missions over central and northern California pretty much ever since to try to learn more about crustal deformation across the region. 00:18:51.000 --> 00:19:10.000 And I would love to tell you all about all the recent studies that we've been undertaking, including a regional study of northern California using the ALOS-2 L-band satellite mission by Danny Lindsey and [indiscernible] integration of 8 different SAR systems to 00:19:10.000 --> 00:19:32.000 illuminate the 3D variations of both fault creep and landsliding in the Berkeley Hills [indiscernible], but in the interest of time, and with only 14.5 minutes remaining, I want to focus on two recent and ongoing 00:19:32.000 --> 00:19:53.000 studies that I think nicely illuminate those space-geodetic capabilities, with a focus on time dependence and the time scales of multi-annual and even decadal-scale variations that have now become accessible with space geodesy, further enhanced with 00:19:53.000 --> 00:20:04.000 observations in the field as well as seismicity and repeating earthquakes that also tell us something about fault slip at depth. 00:20:04.000 --> 00:20:05.000 So first, I want to talk about work that was just published in JGR 00:20:05.000 --> 00:20:16.000 this last week by Yushin Lee, a former grad student here at UC 00:20:16.000 --> 00:20:21.000 Berkeley who just started a postdoc at Caltech, and she looked at Sentinel-1 00:20:21.000 --> 00:20:42.000 InSAR deformation data, with observations starting in 2015, over the triple junction region between the San Andreas fault and the southern Calaveras fault, which almost merge and parallel each other over quite some distance here in the 00:20:42.000 --> 00:20:54.000 southernmost portion. And so this 5-year record of surface velocities nicely agrees with GPS measurements of stations across the region. 00:20:54.000 --> 00:21:00.000 So it captures deformation at just a couple millimeters per year 00:21:00.000 --> 00:21:21.000 accuracy.
And by differencing the line-of-sight displacement across faults in the region, we are able to establish what the creep rate is along these faults, and so for the San Andreas and Calaveras faults we can map 00:21:21.000 --> 00:21:36.000 out the creep, and even very slow-moving faults, like the Sargent and Quien Sabe faults, seem to send a measurable signal of fault 00:21:36.000 --> 00:21:56.000 creep, consistent also with field observations. So if you look at the distribution of fault creep rate along the fault as a function of latitude from south to north, both along the San Andreas fault and the Calaveras fault, we can map out the spatial variation quite 00:21:56.000 --> 00:21:59.000 nicely and compare it with previous results from alignment array 00:21:59.000 --> 00:22:13.000 measurements, creepmeters, and previous InSAR studies, and there are some consistencies, but also some differences. 00:22:13.000 --> 00:22:16.000 Now we're especially interested in temporal variations. 00:22:16.000 --> 00:22:22.000 So here we're looking now at fault creep variations, 00:22:22.000 --> 00:22:28.000 deviations from the average rate, as a function of latitude, 00:22:28.000 --> 00:22:34.000 again on the San Andreas and Calaveras faults, but also as a function of time. 00:22:34.000 --> 00:22:43.000 So here in these red regions, we have displacements that overshoot the average displacement, and in blue we have times where the rate is apparently decreased, 00:22:43.000 --> 00:23:08.000 leading to less-than-average creep along the fault, and you can see that in that southernmost portion of the San Andreas and Calaveras faults, in the area where they overlap, both of them have decreased slip rates in the 2017 00:23:08.000 --> 00:23:15.000 to 2019 period, with a slight difference in when that occurs. 00:23:15.000 --> 00:23:20.000 So this is shown again here in these time series plots. 00:23:20.000 --> 00:23:47.000 So here we have 3 sections of the San Andreas fault, and in blue the curve of the creep rate variations as a function of time, de-trended and filtered with an 18-month filter, and you can see this decrease below average displacement in 00:23:47.000 --> 00:23:54.000 this 2017 to 2018 time period. On the Calaveras fault we see a similar decrease, but it's slightly shifted by half a year 00:23:54.000 --> 00:24:09.000 or so in time. Now you can also see on here de-trended, cumulative creep inferred from repeating earthquakes, 00:24:09.000 --> 00:24:14.000 from analysis by Taka'aki Taira, 00:24:14.000 --> 00:24:33.000 that seems to nicely mirror what we see in the InSAR data, and then also showing here b-value variations of seismicity in an [indiscernible] band along the San Andreas fault that also seem to have the same temporal trend. So there seems to be a persistent 00:24:33.000 --> 00:24:37.000 multi-annual variation, with a decrease during this period and a somewhat 00:24:37.000 --> 00:24:49.000 time-shifted decrease following on the southernmost Calaveras fault and Paicines fault. 00:24:49.000 --> 00:24:53.000 So what might be producing this? It could reflect kind of a trade-off of slip between the two faults. 00:24:53.000 --> 00:25:06.000 Maybe as we keep looking at this we'll see sort of a ping-ponging back-and-forth of slip on these two creeping segments of at least two faults. 00:25:06.000 --> 00:25:12.000 It could have something to do with effects of hydrological forcing. 00:25:12.000 --> 00:25:13.000 In 2016 there was a deep drought that ended, with a wetter period following.
00:25:13.000 --> 00:25:30.000 So there could be changes in stress and/or pressure on the fault that resulted in these creep rate variations. 00:25:30.000 --> 00:25:36.000 Now moving on to the second study I wanted to discuss, looking at creep on the Hayward fault. 00:25:36.000 --> 00:25:55.000 Here we have worked on a number of studies, probably a handful of papers, taking surface deformation data and also repeating earthquakes and other geodetic observations to infer the distribution of creep along strike and with depth 00:25:55.000 --> 00:26:16.000 on the fault. So here are two examples of studies that both used about 18 years of InSAR data along the Hayward and also, in this case, the Calaveras fault to map out, using modeling, the distribution of fault creep and locking in the subsurface, both of which consistently show a 00:26:16.000 --> 00:26:22.000 more coupled, locked region at depth that presumably might be the zone that slips again in future 00:26:22.000 --> 00:26:47.000 large Hayward fault earthquakes. Now Mostafa Khoshmanesh, who worked with Mano Shirzaei a few years back, looked at the 2015 to 2020 Sentinel-1 data to pretty much repeat that exercise and see how these more recent and very 00:26:47.000 --> 00:26:54.000 high-quality data might change the picture, and the goal was also to expand the analysis along the Rodgers Creek fault. 00:26:54.000 --> 00:27:03.000 Again, the data agreed very nicely with GPS velocities and surface creep measurements. 00:27:03.000 --> 00:27:07.000 Now in the model that Mostafa came up with, the coupling looks very different. 00:27:07.000 --> 00:27:19.000 Now we are still trying to dig deeper into possible differences in how the data were treated, the smoothing, and the model inversion, 00:27:19.000 --> 00:27:28.000 but there's certainly a first-order decrease in coupling, a higher slip rate overall in the subsurface, 00:27:28.000 --> 00:27:35.000 inferred from this later data set compared to the early ERS and Envisat data. 00:27:35.000 --> 00:27:50.000 So color contours on this upper plot indicate the apparent acceleration, positive acceleration along pretty much the whole Hayward fault from Point Pinole to Fremont. 00:27:50.000 --> 00:28:07.000 Now to assess this, we're also looking at creep rates estimated from repeating earthquakes that Taka Taira analyzed along the Hayward and Calaveras faults. 00:28:07.000 --> 00:28:12.000 So in the top panel we see the rate inferred from repeating earthquakes 00:28:12.000 --> 00:28:36.000 prior to 2010 on the Hayward fault, converted through calibration equations to a creep rate, and at the bottom we see creep rates following 2010, and these are plotted as distance from Point Pinole. 00:28:36.000 --> 00:28:56.000 So you can see on the Hayward fault there seems to be in many areas a somewhat higher creep rate, whereas on the Calaveras fault south of the junction of the two faults, the creep rate apparently decreased between these two time periods. We 00:28:56.000 --> 00:29:03.000 can plot up the rate change in zones where we had enough repeating earthquakes before and after. 00:29:03.000 --> 00:29:24.000 And so we see this persistent acceleration of creep on the Hayward fault, deceleration on the Calaveras fault, also seen in this pre- to post-creep rate estimate where the color indicates distance from Point Pinole.
These are higher rates along the 00:29:24.000 --> 00:29:34.000 Hayward fault in the northern 50 or so kilometers, and lower creep rates inferred from repeating earthquakes on the Calaveras fault 00:29:34.000 --> 00:29:58.000 to the south. Then we see similar patterns in surface creep rates measured using alignment arrays through many decades of effort by Jim Lienkaemper and a number of people from San Francisco State University and at the USGS. So you can see along the Hayward 00:29:58.000 --> 00:30:03.000 fault from Point Pinole to Fremont persistently higher rates 00:30:03.000 --> 00:30:11.000 since 2010 compared to before. Again, an x-y plot shows this consistent acceleration. 00:30:11.000 --> 00:30:24.000 So there appears to have been an increase overall in creep rate along the Hayward fault over the last few decades. 00:30:24.000 --> 00:30:28.000 And so the question is, what might this be due to? This 00:30:28.000 --> 00:30:33.000 could have to do with fault interactions and earthquake cycle effects. 00:30:33.000 --> 00:30:35.000 We know the southern Hayward fault and Calaveras fault 00:30:35.000 --> 00:30:53.000 decelerated due to static stress interactions with the Loma Prieta rupture in the aftermath of Loma Prieta, and have since recovered some. And we also think that maybe the lingering stress shadow from 1906 could still be decaying over these 00:30:53.000 --> 00:31:00.000 decadal time scales. Or again, there could be some climatic effect. 00:31:00.000 --> 00:31:01.000 But there does seem to be a persistent increase. 00:31:01.000 --> 00:31:23.000 Now this reminded me a little bit of what was observed by Mavrommatis et al. in the decades prior to the Tohoku earthquake, where the faults surrounding the locked asperity that eventually ruptured in 2011 accelerated, 00:31:23.000 --> 00:31:28.000 as evidenced by GPS as well as repeating earthquake data in the decades prior to the rupture. 00:31:28.000 --> 00:31:42.000 We cannot say anything about this having to do with this kind of precursory process, but it's an intriguing comparison that I hope you'll forgive me for. 00:31:42.000 --> 00:31:52.000 So in conclusion, we find changes in fault creep rate on the partially coupled San Andreas, 00:31:52.000 --> 00:31:58.000 Calaveras, and Hayward faults on multi-annual to decadal time scales. 00:31:58.000 --> 00:32:17.000 Those presumably reflect changes in the applied stress, which might be either tectonic or non-tectonic in nature, or changes in fault strength, which might reflect pore-pressure variations or frictional properties in the fault zone. And I'm looking forward to questions and discussions 00:32:17.000 --> 00:32:25.000 with you following this session. Thank you so much. 00:32:25.000 --> 00:32:31.000 Thank you. Yeah. I think we had a good exchange of questions and discussion in the chat 00:32:31.000 --> 00:32:41.000 already, and hopefully we'll have some more in the discussion session. Let's move on now to our next presentation 00:32:41.000 --> 00:32:50.000 by Greg Beroza at Stanford, where he teaches but is also involved in other activities. 00:32:50.000 --> 00:33:08.000 And I know it's a long list of co-authors, and the presentation will be "Some recent developments in earthquake monitoring." 00:33:08.000 --> 00:33:13.000 Hello! My name is Greg Beroza. I teach in the Geophysics Department at Stanford. 00:33:13.000 --> 00:33:22.000 I want to thank you for the opportunity to give an update on our research [noise] and in particular, our efforts for improved earthquake monitoring.
00:33:22.000 --> 00:33:29.000 These continue to be driven by the methods of AI, and in particular, machine learning. We get support for this effort from the USGS, of course, but also from the Department of Energy and from the Air Force 00:33:29.000 --> 00:33:52.000 Research Laboratory. So as some of you may know, we've been at this for a few years, but there's still a lot of work to do, and I think we're still on the steep part of the curve for making progress in developing what I like to call deep catalogs. I also think there's an 00:33:52.000 --> 00:33:57.000 unmet need to explore these catalogs, both to understand what's in them, 00:33:57.000 --> 00:34:11.000 what's not in them, how they differ by technique, and also to explore them in the sense that I think they have tremendous potential to teach us new things about earthquake processes. 00:34:11.000 --> 00:34:33.000 So the basic task, as we are approaching it at least, is to take continuous records of ground motion, or waveforms, and develop a catalog of discrete seismic events that is as comprehensive, that is, as complete, as it can possibly be. This is classically represented as being done through a 00:34:33.000 --> 00:34:41.000 workflow that looks something like this, with a linked set of tasks that are sort of modular, and we have developed machine learning models, 00:34:41.000 --> 00:34:51.000 sometimes several of them, for each of these tasks, and these can be, you know, plugged in as desired, in a modular manner. 00:34:51.000 --> 00:34:56.000 So when we go about doing this, we use what we know about the nature of the task. 00:34:56.000 --> 00:35:05.000 What we know as seismologists informs the machine learning approach and architecture that we think should be best suited to carrying it out. 00:35:05.000 --> 00:35:13.000 Of course, that changes with time, and so these models, these methods, naturally change with time as well. 00:35:13.000 --> 00:35:21.000 Once these have been reviewed and published we may make our models openly available through GitHub, and we continue to work to improve 00:35:21.000 --> 00:35:33.000 their performance. So I think it's fair to say that AI in general, and machine learning in particular, has had a dramatic impact on seismology. 00:35:33.000 --> 00:35:58.000 Most of that has occurred through supervised learning, the blue part of the bars in this plot, in which neural networks learn complex relationships that are embedded in data by being exposed to and learning from very large labeled data sets that are full of instructive examples. So the examples 00:35:58.000 --> 00:36:19.000 I will show in this talk are all supervised methods, or they are data that are organized and curated in such a way that they will support the development of supervised methods. Having said that, I think there is great promise, perhaps the greatest promise for the future, in 00:36:19.000 --> 00:36:23.000 unsupervised methods, so we shouldn't forget about those. 00:36:23.000 --> 00:36:41.000 I should also mention that the earthquake monitoring tasks, which are sort of the lower half of this plot, have seen an outsized share of the effort in the application of machine learning methods in seismology, and I expect that that will broaden with time. 00:36:41.000 --> 00:36:42.000 Okay. So the first one: Ian McBrearty, he's a grad student shown here on the left. 00:36:42.000 --> 00:36:53.000 He's developed a graph-neural-network-based approach to earthquake phase association that we call GENIE.
00:36:53.000 --> 00:36:57.000 So graph neural networks are designed to work with non-Euclidean data. 00:36:57.000 --> 00:37:17.000 And by that I mean data that's organized on an irregular collection of nodes, without necessarily any canonical or natural ordering. A seismic network is an example of an irregular collection of nodes that can be represented with a graph, and it differs 00:37:17.000 --> 00:37:36.000 from regular data, like a one-dimensional time series, which has a constant dt, or a two-dimensional image, which has regularly spaced pixels. On those sorts of data we can use convolution in a straightforward manner, but for irregular data 00:37:36.000 --> 00:37:47.000 we need to do graph convolutions that take into account the structure of the data, using things like adjacency matrices, 00:37:47.000 --> 00:37:58.000 so that our methods can learn things that are relevant to, say, phase association, like moveout across an irregular seismic network (a minimal sketch of such a graph-convolution step is shown below). 00:37:58.000 --> 00:38:02.000 So phase association is a deceptively difficult problem; 00:38:02.000 --> 00:38:06.000 it's instructive to show how GENIE is able to make phase associations for small events in which only a tiny fraction of a seismic network is illuminated. 00:38:06.000 --> 00:38:14.000 That will be the case for most earthquakes, which, after all, are small. 00:38:14.000 --> 00:38:18.000 The left panel shows the associated phases, blue for P and orange for S. 00:38:18.000 --> 00:38:30.000 The middle panel shows the total phases that they're associated from, and the vast majority of arrivals in this example are unassociated and perhaps false detections or artifacts. 00:38:30.000 --> 00:38:38.000 And the right panel shows the stations relative to the events that they're associated with, shown in green. 00:38:38.000 --> 00:38:43.000 The red stations that have associated phases make sense. 00:38:43.000 --> 00:38:49.000 If we have a larger earthquake, we need to associate that as well, and this shows the P and the S 00:38:49.000 --> 00:38:54.000 wave associations for a magnitude 2 earthquake across the network, and so on. 00:38:54.000 --> 00:39:03.000 This is for a magnitude 3, which illuminates something like half of the network with detectable phases, and an effective phase associator has to be able to deal with 00:39:03.000 --> 00:39:06.000 all these contingencies. So anyway, those are the examples from the GENIE associator. 00:39:06.000 --> 00:39:14.000 We applied it to a 100-day sample of data from the greater San Francisco Bay Area. 00:39:14.000 --> 00:39:23.000 We get the orange events; these are not relocated, 00:39:23.000 --> 00:39:40.000 so the distribution is pretty fuzzy, but there's something like four times as many earthquakes as there are in the USGS catalog, including many more aftershocks of the magnitude 4.6 event that happened on the creeping section during this time period. 00:39:40.000 --> 00:40:02.000 So we can zoom in close in space and time to that location, and we can see that on the left, not surprisingly, the newly detected events are small, in the magnitude 0 to 1 range, and we detect many of them at least in part because machine learning phase pickers 00:40:02.000 --> 00:40:05.000 like PhaseNet, which is used here, are super effective at detecting S-waves, and S-waves for local earthquakes tend to be much larger than P- 00:40:05.000 --> 00:40:20.000 waves, and hence emerge first above the noise for small events. On the right, it shows that we have many, many more 00:40:20.000 --> 00:40:27.000 aftershocks following the Omori-like decay of this creeping-section event.
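Here is the minimal graph-convolution sketch promised above: one mean-aggregation message-passing step over a toy station network, in illustrative NumPy. The station coordinates, the 40 km connection radius, and the random weights are all made up; this is not the GENIE implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
xy = rng.uniform(0, 100, size=(8, 2))        # toy station coordinates (km)
feat = rng.normal(size=(8, 4))               # per-station input features

# Adjacency: connect stations closer than 40 km, plus self-loops, so each
# node aggregates information from its geographic neighbors.
dist = np.linalg.norm(xy[:, None] - xy[None, :], axis=-1)
A = ((dist < 40.0) | np.eye(8, dtype=bool)).astype(float)
A /= A.sum(axis=1, keepdims=True)            # row-normalize (mean aggregation)

W = rng.normal(scale=0.1, size=(4, 4))       # learnable weights (random here)
hidden = np.maximum(A @ feat @ W, 0.0)       # aggregate neighbors, transform, ReLU
print(hidden.shape)                          # (8, 4): updated node embeddings
```

Stacking several such steps lets information propagate across the network, which is how moveout-like patterns over an irregular station geometry can be learned.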
00:40:27.000 --> 00:40:37.000 Okay, if we form receiver gathers, where we take those associated events, grab the seismograms, and compare them to one another, 00:40:37.000 --> 00:40:46.000 we can see that the waveforms are extremely similar, which gives us confidence that these are real, correctly associated 00:40:46.000 --> 00:40:55.000 events. Now our goal in doing this is to do something like what Yongsoo Park did and published last year for Oklahoma and Kansas. 00:40:55.000 --> 00:41:03.000 So Yongsoo published a decadal, ten-year catalog for that region that had over 300,000 events in it. 00:41:03.000 --> 00:41:08.000 That's something like a little over 20 times as many as in ComCat. 00:41:08.000 --> 00:41:11.000 Most of the faults are near-vertical strike-slip. 00:41:11.000 --> 00:41:14.000 So it is relatively easy to visualize what's going on. 00:41:14.000 --> 00:41:16.000 We've seen lots of conjugate faulting. 00:41:16.000 --> 00:41:25.000 We see features that affect the activity, such as the Nemaha Ridge, shown in yellow, which is thought to be a barrier to flow; 00:41:25.000 --> 00:41:28.000 that's relevant because these earthquakes are thought to be induced. 00:41:28.000 --> 00:41:29.000 There are other features that are a little more mysterious, like the yellow boundary on the right to the east. 00:41:29.000 --> 00:41:42.000 It shows a sharp cutoff of activity without a clear geologic explanation for it. 00:41:42.000 --> 00:41:45.000 In processing this data, Yongsoo analyzed about 700 station-years of continuous seismic data. 00:41:45.000 --> 00:41:55.000 So it was a big effort. Ian is going an order of magnitude larger. 00:41:55.000 --> 00:42:00.000 So for northern California we have 20 years of continuous data. 00:42:00.000 --> 00:42:20.000 He's looking at processing on the order of 10,000 station-years of data, and the sort of protoplasmic blob shown on the left indicates two things: one, these are, you know, our locations, which we're not ready to show you in any detail, and the other, that we're getting a handle 00:42:20.000 --> 00:42:41.000 on the computation at this scale. So, without showing our, you know, approximate locations, we can see that the detections for the South Napa earthquake and the San Ramon swarm in 2015 indicate many, many more detections in the catalog, a factor of 4 or 5 00:42:41.000 --> 00:42:46.000 more than in the standard catalog. 00:42:46.000 --> 00:42:47.000 So I mentioned supervised methods earlier in this talk, and how they need data. They're data hungry. And so Leo Aguilar is developing an earthquake waveform database for the regional seismic phases 00:43:02.000 --> 00:43:16.000 Pn and Sn. So we developed and published a catalog of local earthquake waveforms a couple of years ago, which we called STEAD. 00:43:16.000 --> 00:43:20.000 Leo is doing this for the regional phases 00:43:20.000 --> 00:43:26.000 Pn and Sn. As seismologists in the crowd know well, these are important phases, but they're trickier 00:43:26.000 --> 00:43:34.000 than the direct Pg and Sg phases and are more spread out in time, so we've had to adjust our practice. 00:43:34.000 --> 00:43:37.000 Leo has searched globally over multiple data centers.
00:43:37.000 --> 00:43:44.000 Most productively the ISC and the NEIC, for regional phase labels from around the world, and in doing so collected 00:43:44.000 --> 00:43:50.000 what's shown here: almost 6.5 million example 00:43:50.000 --> 00:44:06.000 waveforms. Now, there are issues with these data. And so he developed a machine learning model that seeks to identify the interval between P 00:44:06.000 --> 00:44:31.000 and S on a potential seismic waveform, with the labels defined from the P- and S-wave arrivals; if it has sufficient quality, then he deems it acceptable, and if it has poor quality, then we reject it. So we're using machine learning 00:44:31.000 --> 00:44:33.000 here for quality control; it uses a pretty simple model and makes a pretty simple measurement. So why do we have to do that? 00:44:33.000 --> 00:44:44.000 Well, we actually reject the majority of the data, more than half of the data. 00:44:44.000 --> 00:44:54.000 It may be rejected because it's too noisy, or because the waveforms weren't recovered properly, which can happen because the labels themselves are bad 00:44:54.000 --> 00:44:56.000 in the catalogs, or because there's more than one earthquake in the 5-minute time window 00:44:56.000 --> 00:45:08.000 we're using, which happens very frequently. The last thing we want to do is to teach a machine learning model to overlook small earthquakes, as detecting them is 00:45:08.000 --> 00:45:18.000 one of the tasks we're trying to do better. So anyway, there's this quality control step, which is also machine-learning-powered, that we employ. After that process we're still left with plenty of data: 00:45:18.000 --> 00:45:26.000 3 million examples, expressed as 3-component, 5-minute samples, an aggregate of about 30 years of labeled data. 00:45:26.000 --> 00:45:46.000 So no shortage of data at all; we can afford to be picky. As for the earlier work, you may be familiar with Weiqiang Zhu's PhaseNet model, which was developed with USGS support and trained on NCSN data. It's since been used around the world 00:45:46.000 --> 00:45:55.000 by many investigators to detect local earthquake phases very successfully. Now, Weiqiang treated arrival time measurement as a semantic segmentation problem. 00:45:55.000 --> 00:45:56.000 He trained PhaseNet to classify continuous waveforms as either P- 00:45:56.000 --> 00:46:04.000 wave, S-wave, or noise. That's what we mean by semantic segmentation. 00:46:04.000 --> 00:46:05.000 But he engineered it, cleverly, in such a way that the P- 00:46:05.000 --> 00:46:10.000 wave and the S-wave probabilities would peak at the arrival-time 00:46:10.000 --> 00:46:22.000 labels for P- and S-waves, and so the output is a time series of classification probabilities that peaks at the labeled arrival times, from which we get the measurement. 00:46:22.000 --> 00:46:23.000 Now postdoc Artemis Novoselov has taken a different approach. 00:46:23.000 --> 00:46:35.000 He's treating machine learning arrival-time measurement as a regression problem, in which the input is the same three-component waveform, 00:46:35.000 --> 00:46:39.000 but the output is a single number: the arrival time of a seismic phase. 00:46:39.000 --> 00:46:42.000 This has some advantages. The arrival time is what we're after, 00:46:42.000 --> 00:46:43.000 of course, and there's a potential, at least in principle, to get sub-sample arrival- 00:46:43.000 --> 00:46:57.000 time measurements through this kind of approach.
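To make the contrast between the two labeling strategies concrete, here is a small sketch of both target types for one synthetic 5-second window. The sampling rate, Gaussian label width, and pick times are illustrative assumptions, not the published PhaseNet or PhaseHunter configurations.

```python
import numpy as np

fs, n = 100, 500                   # 100 Hz sampling, 5 s window (illustrative)
t = np.arange(n) / fs
p_arrival, s_arrival = 1.20, 2.35  # seconds; hypothetical labeled picks

# Segmentation-style target (PhaseNet-like): per-sample class probabilities
# engineered to peak at the labeled arrival times, here with a 0.1 s Gaussian.
def peak_label(arrival, width=0.1):
    return np.exp(-0.5 * ((t - arrival) / width) ** 2)

p_prob, s_prob = peak_label(p_arrival), peak_label(s_arrival)
noise_prob = np.clip(1.0 - p_prob - s_prob, 0.0, 1.0)

# Regression-style target (PhaseHunter-like): the arrival time itself, a
# single number per phase rather than a probability trace, which is what
# allows sub-sample precision in principle.
regression_target = {"P": p_arrival, "S": s_arrival}

print(p_prob.argmax() / fs, s_prob.argmax() / fs, regression_target)
```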
Now one of the things he did is to use something called subsampling. 00:46:57.000 --> 00:47:01.000 It's an intermediate approach between ensembling and dropout. 00:47:01.000 --> 00:47:15.000 His model, which he calls PhaseHunter, uses these subnetworks to develop multiple estimates of the arrival time, such that he can create a distribution that captures the uncertainty of the measurement. 00:47:15.000 --> 00:47:19.000 So this should be really helpful in giving the measurements appropriate 00:47:19.000 --> 00:47:28.000 weights. This animation shows how phase-picking uncertainty, which is shown as this distribution, varies as the noise increases. Not surprisingly, 00:47:28.000 --> 00:47:32.000 noise, as it increases, interferes with accurate arrival- 00:47:32.000 --> 00:47:35.000 time measurements, and that's reflected in the width 00:47:35.000 --> 00:47:37.000 and character of the distribution. Also, due to its larger amplitude, 00:47:37.000 --> 00:47:42.000 it consistently picks S-waves more accurately than P- 00:47:42.000 --> 00:47:48.000 waves, at least in this example. So I'll conclude by saying that there's a lot of activity and a lot of progress in earthquake monitoring using AI. 00:47:48.000 --> 00:47:56.000 I've shown three examples of works in progress that indicate there's still a lot to do. 00:47:56.000 --> 00:48:05.000 By this time next year we hope to have a deep earthquake catalog for northern California, using the available continuous data. We're working with archival data, 00:48:05.000 --> 00:48:06.000 but there's no reason that we can't work with streaming data 00:48:06.000 --> 00:48:14.000 and process things in real time. 00:48:14.000 --> 00:48:20.000 And then, you know, I think it's gonna remain a dynamic environment in terms of catalog development. 00:48:20.000 --> 00:48:25.000 And we have to get serious. We have to get creative about extracting understanding from these results. 00:48:25.000 --> 00:48:36.000 So thank you for your attention, and I'll stop my talk there. 00:48:36.000 --> 00:48:46.000 Thank you very much, Greg. Very fascinating to see all these latest advancements in earthquake detection. 00:48:46.000 --> 00:48:56.000 So we'll move on to our next presentation by Russ 00:48:56.000 --> 00:48:57.000 Graymer from the USGS on "Developing a geology-based 3D seismic velocity 00:48:57.000 --> 00:49:11.000 model of the Central California Coast Ranges." 00:49:12.000 --> 00:49:19.000 This presentation is about the Central California Coast Ranges 3D geologic map and seismic velocity model. My name is Russell Graymer, and I'll be taking you through the progress 00:49:19.000 --> 00:49:31.000 we've made in the last year. 00:49:31.000 --> 00:49:54.000 Years ago, we [indiscernible] of PG&E CRADA money and leveraged that to create an initial 3D geologic model of the Central California Coast Ranges between the Western Transverse Ranges in the south and Monterey Bay in the north, bounded on the east by the 00:49:54.000 --> 00:50:00.000 San Andreas fault, and on the west by the coastline. 00:50:00.000 --> 00:50:08.000 We used the 3D fault model from the 2008 CRADA-funded effort 00:50:08.000 --> 00:50:16.000 and the 3D block model from Jachens et al., added some additional faults, e.g., in 00:50:16.000 --> 00:50:33.000 the Irish Hills and around lenses of uplifted basement in major fault zones, and then we constructed a top-basement surface from sparse wells, widely spaced cross-sections, and downward projections of surface geologic mapping.
00:50:33.000 --> 00:50:42.000 We also incorporated two more detailed models of the top-basement surface that were available at the time. 00:50:42.000 --> 00:50:46.000 In cyan, you see the Santa Maria Basin model by Sweetkind et al., and in magenta 00:50:46.000 --> 00:50:56.000 you see the model west of the San Andreas fault between Parkfield and Pinnacles 00:50:56.000 --> 00:51:03.000 by Roberts et al. 00:51:03.000 --> 00:51:14.000 So for the new model we have modified the fault framework and revised the top-basement surface to correct errors and add more detail. 00:51:14.000 --> 00:51:22.000 We have subdivided the strata above the top-basement surface into three stratigraphic packages: 00:51:22.000 --> 00:51:26.000 Cretaceous, Paleogene, and Neogene. 00:51:26.000 --> 00:51:52.000 And we've added additional detail in key areas. Here's an example of the modification of the fault framework. In the version 0.1 model, the original model, we had based our faults solely on downward projection of the mapped surface traces; in some places, like the Abel Mountain 00:51:52.000 --> 00:52:06.000 Thrust, this had led to problems. You can see the Abel Mountain Thrust mapped surface trace ends in a hook around to the southeast. 00:52:06.000 --> 00:52:15.000 This created an extension that cut across the geology and the gravity signal. For the version 0.2 model 00:52:15.000 --> 00:52:34.000 we have added a blind fault extension to the southwest, based on details of the surface geology and the gravity signal, that better honors the geologic relations. 00:52:34.000 --> 00:52:39.000 Sticking to that same area, you can also see an example of how we have modified the top-basement surface. 00:52:39.000 --> 00:52:52.000 The dots represent the structure contours of the original model; the red lines represent the new structure contours. 00:52:52.000 --> 00:52:58.000 You can see in many places they are the same. But, for example, above the Abel Mountain Thrust they are quite different. 00:52:58.000 --> 00:53:24.000 The new contours are much more detailed near the surface contact, in order to better capture the interplay between the topography and the modeled surface, and also make a more detailed 00:53:24.000 --> 00:53:25.000 attempt to capture the folded nature of the contacts 00:53:25.000 --> 00:53:43.000 on the southwest, and of course they are adjusted to fit the new fault framework. 00:53:43.000 --> 00:53:53.000 Here are the structure contours on the tops of the stratigraphic surfaces that subdivide the above-basement strata. 00:53:53.000 --> 00:54:04.000 The green structure contours are on the top basement, the orange on top Cretaceous, and the yellow on top 00:54:04.000 --> 00:54:21.000 Paleogene. Of course, there's no structure contour on top Neogene because that is the uppermost unit; the top of that unit, where it exists, is the topographic surface. 00:54:21.000 --> 00:54:41.000 Zooming into an area between the Santa Ynez and the Santa Ynez River faults in the Western Transverse Ranges, here's an example of the top Cretaceous surface construction. You can see that it's quite a bit more detailed than the original 00:54:41.000 --> 00:54:45.000 top basement. We have again added a lot of detail 00:54:45.000 --> 00:55:01.000 near the surface contact of the unit in question to better incorporate the interaction between the surface and the topography.
00:55:01.000 --> 00:55:08.000 You'll also see that we have incorporated 00:55:08.000 --> 00:55:18.000 intra-block faults that offset the stratigraphy but are too minor to 00:55:18.000 --> 00:55:28.000 include in the model as block-bounding faults; these are now modeled as near-vertical [indiscernible]. 00:55:28.000 --> 00:55:34.000 Basically, we already have more than 70 fault blocks in this model; 00:55:34.000 --> 00:55:42.000 it's impossible to incorporate all of the faults that actually offset the modeled surfaces. 00:55:42.000 --> 00:56:01.000 The near-vertical approximation is quite good for strike-slip faults, decent for normal faults, and not as good for reverse and thrust faults. 00:56:01.000 --> 00:56:04.000 Sticking to that same area, here's the top Paleogene surface. Again, 00:56:04.000 --> 00:56:13.000 you can see those near-vertical contacts that represent the faults internal to the block. 00:56:13.000 --> 00:56:26.000 You can also, I hope, see that the spacing between the structure contours is not the same everywhere, but is variable. 00:56:26.000 --> 00:56:27.000 It added a lot of work to try to incorporate the along-strike changes in dip. 00:56:27.000 --> 00:56:40.000 This is a big difference between versions 0.2 and 0.1: in 0.1 00:56:40.000 --> 00:56:49.000 we assumed that the dip was uniform 00:56:49.000 --> 00:57:14.000 across large stretches. Now we've made a much more careful effort to incorporate local changes in dip along strike. We've also incorporated places where the units truncate downward onto the underlying surfaces, so in the northwest corner of this block 00:57:14.000 --> 00:57:20.000 you'll see structure contours that appear to just stop. 00:57:20.000 --> 00:57:32.000 This is actually where the Paleogene unit pinches out downward against the underlying Cretaceous. 00:57:32.000 --> 00:57:36.000 Here's an example of an extra-detailed area. We zoomed in on the Irish Hills. 00:57:36.000 --> 00:57:41.000 You can see by how pixelated it is just how far we've zoomed in. 00:57:41.000 --> 00:57:45.000 You can also see that there's quite a lot of detail here. 00:57:45.000 --> 00:57:58.000 We've incorporated the multiple stacked blind faults and overlapping top-basement surfaces that we presented in the SSHAC meeting back in 2010. 00:57:58.000 --> 00:58:07.000 All that detail is now incorporated into the 3D model. 00:58:07.000 --> 00:58:15.000 The latest addition is that, now that we have a finalized 3D 00:58:15.000 --> 00:58:32.000 geologic model, we are working to bring in the seismic velocities by developing stratigraphic-package-specific depth-versus-velocity relations from velocity logs in oil and gas wells. 00:58:32.000 --> 00:58:42.000 On this map, the blue dots are the wells, the straight vertical wells with velocity logs. 00:58:42.000 --> 00:58:45.000 And this is a plot of the data from those straight vertical wells. 00:58:45.000 --> 00:58:49.000 This is very preliminary data. I just got this last week. 00:58:49.000 --> 00:58:53.000 We have some known quality control issues with this data. But I thought I would go ahead and show it. 00:58:53.000 --> 00:59:03.000 The black dotted line is the exponential trend line for the 00:59:03.000 --> 00:59:10.000 light-orange Neogene data. Most of the data that we have is in the Neogene. 00:59:10.000 --> 00:59:16.000 You'll notice that our data only goes down to about 1,300 meters below sea level.
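As a sketch of the trend-line fitting just described, the snippet below fits an exponential velocity-depth curve to synthetic stand-in samples and extrapolates it below the deepest observation. The data values, the exponential form, and its fitted parameters are illustrative, not the actual Neogene well data.

```python
import numpy as np
from scipy.optimize import curve_fit

# Synthetic stand-in sonic-log samples for one stratigraphic package.
depth = np.array([100.0, 300.0, 500.0, 800.0, 1100.0, 1300.0])  # m below sea level
vp = np.array([1.9, 2.1, 2.3, 2.6, 2.9, 3.1])                   # km/s

def vp_model(z, v0, k):
    """Exponential trend: velocity grows from v0 with depth constant 1/k."""
    return v0 * np.exp(k * z)

(v0, k), _ = curve_fit(vp_model, depth, vp, p0=(1.8, 1e-4))
print(f"v0 = {v0:.2f} km/s, k = {k:.2e} 1/m")
# Extrapolating below the deepest log sample is only as good as the assumed
# functional form, which is the downward-projection question raised next.
print(f"extrapolated Vp at 2500 m: {vp_model(2500.0, v0, k):.2f} km/s")
```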
00:59:16.000 --> 00:59:25.000 You see, this is a big difference between this area and the Delta, where our data went quite a bit deeper. 00:59:25.000 --> 00:59:31.000 So we're going to have to figure out how to best project downward 00:59:31.000 --> 00:59:39.000 below the volume where we actually have observed velocities. 00:59:39.000 --> 00:59:48.000 We also are working to bring in additional wells that aren't straight vertical wells but are deviated. 00:59:48.000 --> 00:59:54.000 That takes another step: bringing in the deviation log, both to 00:59:54.000 --> 01:00:04.000 calculate the true vertical depth and to make sure that the lateral variation hasn't taken us into a different unit, 01:00:04.000 --> 01:00:12.000 for example, by drilling laterally across a fault. 01:00:12.000 --> 01:00:32.000 So, to sum up, we now have a finalized 3D geologic model, and we're working on bringing in velocity log data to create unit-specific velocity-depth relations. 01:00:32.000 --> 01:00:34.000 We've learned a few lessons along the way. First, 01:00:34.000 --> 01:00:39.000 this is a very large area, probably too large for a single 3D model. 01:00:39.000 --> 01:00:45.000 The large area required many simplifications, some of which we've already discussed. 01:00:45.000 --> 01:00:56.000 We've learned that the widely spaced cross sections that we tried to use to constrain the version 0.1 model were not sufficient. 01:00:56.000 --> 01:01:14.000 If you're going to use cross sections to depict 3D geology, they have to be much more closely spaced, which obviously relates to the large area of the model; it's much easier to get closely spaced cross sections in a smaller area. 01:01:14.000 --> 01:01:20.000 It's also important to account for the effect of topography in regions of high relief. 01:01:20.000 --> 01:01:27.000 You can't assume that the subsurface structure contours are parallel to the surface contact. 01:01:27.000 --> 01:01:43.000 The next steps are to finish developing the depth-versus-velocity relations, to integrate those with the 3D geologic model to create the 3D seismic velocity model, and to get the whole thing peer-reviewed and published, transforming 01:01:43.000 --> 01:01:59.000 it from version 0.2 to version 1.0. Longer term, we're hoping to revisit this model and to upgrade it by incorporating offshore geology. 01:01:59.000 --> 01:02:07.000 Right now we stop right at the shoreline, but we know that there's a depiction, at least in the near shore, of the offshore 01:02:07.000 --> 01:02:14.000 geology, and we would also like to add even more detail in critical areas. 01:02:14.000 --> 01:02:22.000 We also want to continue to try to integrate geology-based 3D seismic velocity models like this one with tomographic models. 01:02:22.000 --> 01:02:49.000 I feel that each approach has strengths and weaknesses, but that a combined approach could produce a velocity model superior to that produced by any one approach. Thank you for your attention. 01:02:49.000 --> 01:02:53.000 No, thank you, Russ, that's excellent progress. That's amazing; 01:02:53.000 --> 01:03:02.000 it's really a large area, as you say, and an ambitious project, but with good progress. 01:03:02.000 --> 01:03:20.000 So our last talk today will be by Evan Hirakawa from the USGS on the "Current state and future directions for the USGS San Francisco Bay Area region 3D 01:03:20.000 --> 01:03:30.000 seismic velocity model (SF-CVM)." 01:03:30.000 --> 01:03:36.000 Hello, everyone!
I'm Evan Hirakawa from the USGS Earthquake Science Center in Moffett Field. 01:03:36.000 --> 01:03:39.000 Today I'm going to give a broad overview of the USGS 01:03:39.000 --> 01:03:44.000 San Francisco Bay region 3D geology and seismic velocity models. 01:03:44.000 --> 01:03:58.000 These are really the primary velocity models in use for the San Francisco Bay Area, and in this talk I'm going to touch on the development history of these models, show some features of the most current versions and things that were updated recently by myself and 01:03:58.000 --> 01:04:07.000 Brad Aagaard, and briefly talk about where we'd like to see the model go in the future. 01:04:07.000 --> 01:04:10.000 Seismic hazard and risk is high overall in the Bay Area. 01:04:10.000 --> 01:04:17.000 We've known this for hundreds of years, since there have been multiple disastrous and deadly earthquakes just in historic times. 01:04:17.000 --> 01:04:24.000 But understanding the relative amount of hazard from site to site is very difficult, and it can be highly variable. 01:04:24.000 --> 01:04:37.000 One thing we've understood for a long time is that the subsurface geology plays a huge role in determining how the shaking may look at any given location, and this became very clear after the 1989 Loma Prieta earthquake, where we saw amplified 01:04:37.000 --> 01:04:43.000 ground motions in Oakland above what's known as this very soft, low-velocity Bay mud, 01:04:43.000 --> 01:04:57.000 relative to sites where the shaking wasn't as bad, on top of bedrock or even on sand and alluvium. Because of this, the Nimitz Freeway collapsed in one part above the Bay mud but was actually left standing above the alluvium. So 01:04:57.000 --> 01:05:10.000 capturing these kinds of effects and this variability is hard with typical hazard analysis, which is done mostly through empirical models, but we can reproduce some of these kinds of effects with 3D physics-based computer simulations. 01:05:10.000 --> 01:05:18.000 But we really need a good representation of the subsurface to trust them. 01:05:18.000 --> 01:05:33.000 So, in light of this, and in preparation for simulations of the 1906 San Francisco earthquake on its hundredth anniversary, a group of USGS scientists led by Bob Jachens developed the USGS 3D geologic model of the San Francisco Bay 01:05:33.000 --> 01:05:48.000 region. This is a model that includes many of the local faults, with detailed fault geometry obtained from double-difference earthquake relocations, and stratigraphy, basin structures, and other geologic structures obtained from a combination of interpreting 01:05:48.000 --> 01:05:53.000 geologic surface maps, borehole logs, and bringing in some potential-field 01:05:53.000 --> 01:05:57.000 geophysics like gravity and magnetics. 01:05:57.000 --> 01:06:13.000 The resulting model has 25 fault surfaces; 26 fault blocks, which are what we call these areas grouped and split up by the faults; and 33 different lithologic units, or zones, as we call them, which are shown in the different colors 01:06:13.000 --> 01:06:20.000 here. Each one of these is able to hold some kind of different seismic velocity property. 01:06:20.000 --> 01:06:23.000 And then the way that we assign velocities to these zones is by what we often call velocity-versus-depth 01:06:23.000 --> 01:06:33.000 rules, which were originally fit by Tom Brocher from boreholes spread throughout northern California.
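As a minimal sketch of that zone-plus-rule design, the code below keys a velocity-versus-depth function to each zone name; the zone names, rule forms, and coefficients are all made up for illustration and are not Brocher's actual fits.

```python
import math
from typing import Callable, Dict

VelocityRule = Callable[[float], float]   # depth (m) -> Vp (m/s)

# Hypothetical rules; each geologic zone carries its own depth law.
rules: Dict[str, VelocityRule] = {
    "bay_mud":      lambda z: 1500.0 + 0.3 * z,
    "alluvium":     lambda z: 1700.0 + 0.6 * z,
    "franciscan":   lambda z: 4500.0 + 0.2 * z,
    "great_valley": lambda z: 2500.0 * math.exp(2.0e-4 * z),
}

def vp(zone: str, depth_m: float) -> float:
    """Look up the zone's rule and evaluate it at the query depth."""
    return rules[zone](depth_m)

# Swapping a zone's rule retunes velocities everywhere in that zone without
# touching the explicitly defined geologic surfaces -- the idea behind the
# later model updates described in the talk.
rules["franciscan"] = lambda z: 4800.0 + 0.2 * z
```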
01:06:33.000 --> 01:06:40.000 This figure demonstrates how we assign a different one of these velocity-depth rules, which are shown in these curves, 01:06:40.000 --> 01:06:45.000 to each one of these different units, and we end up with seismic velocity models with all the known velocity interfaces, basins, and other structures based on the geology. 01:06:45.000 --> 01:07:02.000 So, for example, here are the San Andreas fault and the Hayward fault, and you can see that there are some clear velocity contrasts across these, which follow very closely with the mapped fault traces. 01:07:02.000 --> 01:07:09.000 A little bit of version history here: the first version of the velocity model was version 05; that stands for 2005. 01:07:09.000 --> 01:07:15.000 Like I said, this was originally developed to do these big 1906 earthquake simulations, 01:07:15.000 --> 01:07:23.000 and it was also used in simulations of the Loma Prieta earthquake to try to validate the model, since we have actual recordings from that event. 01:07:23.000 --> 01:07:31.000 But it's hard to tell which discrepancies in simulated motions are coming from the source and which from the geologic structure. 01:07:31.000 --> 01:07:36.000 So one way to evaluate the model in a different way is to just use much smaller earthquakes. 01:07:36.000 --> 01:07:39.000 Here's a kind of evaluation study by Artie Rodgers. 01:07:39.000 --> 01:07:55.000 [brief black screen] So, in order to eliminate the complexity coming from the source, they just used point-source representations of these moderate earthquakes, and they found that the 3D model predicts the observed waveforms well and reproduces features that arise from the basins, etc. 01:07:55.000 --> 01:07:58.000 And they made some suggestions; for example, they found that the shear velocity was on average about 4 to 5% too high 01:07:58.000 --> 01:08:06.000 near the surface. 01:08:06.000 --> 01:08:07.000 So these recommendations, and maybe some others, led to the 2008 updates. 01:08:07.000 --> 01:08:24.000 This was version 08.3.1. It addressed some of those surface velocities that I mentioned and also made some changes to bring the velocities closer to what was being found by the Cliff Thurber et al. 01:08:24.000 --> 01:08:31.000 2007 tomographic model, and also to reconcile with some more recent refraction and borehole data 01:08:31.000 --> 01:08:35.000 available at the time. Modelers got quite a bit of use out of the 2008 version. 01:08:35.000 --> 01:08:55.000 Here, for example, is a suite of magnitude 7 earthquake simulations done by Aagaard et al., which led to the well-known HayWired scenario. The model continued to be used as computing power got better, up to these high-resolution 4 Hz scenarios, and 01:08:55.000 --> 01:09:03.000 also most recently is being used by this DOE exascale computing project, where they're coupling these really high-frequency 01:09:03.000 --> 01:09:11.000 10 Hz synthetic motions to actual structure simulations. 01:09:11.000 --> 01:09:26.000 So this is where I came into the USGS, around 2018 or 2019, when I worked on my first project evaluating the old version, the 2008 version of the velocity model, and most of this work is published in this BSSA paper, Hirakawa and 01:09:26.000 --> 01:09:34.000 Aagaard, 2022. We're doing something similar to the last evaluation study I described, where we want to just use a bunch of point sources so we don't have these big earthquakes with source complexity.
01:09:34.000 --> 01:09:38.000 We can see how just the synthetics look compared to the observations from these sources. 01:09:38.000 --> 01:09:44.000 So we simulate 20 of these moderate, magnitude 3.5-4.5, earthquakes using SW4, 01:09:44.000 --> 01:10:04.000 and we compare the synthetic waveforms with the real data via a set of quantitative metrics that evaluate arrival time and intensity. Here's a clear demonstration of the change in waveform performance between zones. This is a 01:10:04.000 --> 01:10:18.000 magnitude 4.0 event on the Hayward fault with some stations in Berkeley that record the event, shown by the triangles; the recordings are shown here in black, and the synthetics are shown in [brief black screen] light blue. You can see that, for example, at stations to the west 01:10:18.000 --> 01:10:27.000 of the fault, like the station BRK, [slideshow returns] the synthetic waveform matches quite nicely with the observations. But to the east of the fault, for example at BKS, the synthetics are late and the amplitudes are too high. 01:10:27.000 --> 01:10:32.000 So we made the inference here that velocities in this particular region are not bad 01:10:32.000 --> 01:10:41.000 west of the Hayward fault, but they're probably too slow east of the 01:10:41.000 --> 01:10:51.000 Hayward fault. An easy way to address something like this, where we think the velocities in one zone aren't too bad but an adjacent zone needs attention, is just to reassign 01:10:51.000 --> 01:10:54.000 those velocity-depth rules that I described earlier. 01:10:54.000 --> 01:11:03.000 So we can swap in different choices of the velocity-depth rules for the different zones that we have, and in that way it's easy to keep the velocities the same in areas where we think they're fine and to modify them 01:11:03.000 --> 01:11:16.000 in others, without actually touching the underlying 3D geologic model, without actually changing the explicitly defined surfaces 01:11:16.000 --> 01:11:31.000 stored in there. And so this is what we did, and one of the main changes, for example, was to increase the velocities to the east of the Hayward fault, based on the mismatch that I showed in the last slide, where I thought the velocities might be too low. Other things included 01:11:31.000 --> 01:11:51.000 decreasing the velocity in the Livermore Basin, where we saw that synthetic waveforms were arriving too early, and dividing the zones along some straight lines, just so we could set in a proxy for faults that do not actually exist in the model at this time. 01:11:51.000 --> 01:11:59.000 Now, here's a big collection of synthetics for waveforms going through the East Bay Hills from that modeling study I'm showing here. 01:11:59.000 --> 01:12:08.000 The black waveforms are the observations; the light blue are the synthetics using the old model, the 08 version; and the red synthetics 01:12:08.000 --> 01:12:27.000 are using our updated version, which we call version 21. Without spending much time on this, you can clearly see that the synthetic waveforms improve from version to version in relation to the travel time, and they also remove some of these large reverberation-type artifacts that we see at a 01:12:27.000 --> 01:12:33.000 lot of the stations in the East Bay but that are actually not in the data.
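Here is a minimal sketch of the kind of quantitative comparison described: a cross-correlation lag for arrival-time misfit and a peak-amplitude ratio. This is a generic illustration, not the specific metric set used in Hirakawa and Aagaard (2022).

```python
import numpy as np

def arrival_shift_s(obs: np.ndarray, syn: np.ndarray, dt: float) -> float:
    """Lag (s) maximizing cross-correlation; positive means the observation
    lags the synthetic, i.e., the synthetic arrives early (model too fast)."""
    xc = np.correlate(obs, syn, mode="full")
    lag_samples = int(np.argmax(xc)) - (len(syn) - 1)
    return lag_samples * dt

def amplitude_ratio(obs: np.ndarray, syn: np.ndarray) -> float:
    """Peak-amplitude ratio; > 1 means the synthetic overpredicts shaking."""
    return float(np.max(np.abs(syn)) / np.max(np.abs(obs)))

# Toy check: a synthetic that is a delayed, doubled copy of the observation,
# so the shift should be about -1.0 s (synthetic late) and the ratio 2.0.
dt = 0.01
t = np.arange(0.0, 20.0, dt)
obs = np.exp(-(t - 8.0) ** 2) * np.sin(10.0 * t)
syn = 2.0 * np.exp(-(t - 9.0) ** 2) * np.sin(10.0 * (t - 1.0))
print(arrival_shift_s(obs, syn, dt), amplitude_ratio(obs, syn))
```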
01:12:33.000 --> 01:12:39.000 So, a summary of the improvements for this version, 21.1: we improved synthetic arrival 01:12:39.000 --> 01:12:40.000 times, peak amplitudes, and cumulative motions in many parts of the East Bay, 01:12:40.000 --> 01:12:51.000 Livermore Valley, and North Bay. These are some improvement maps, where green means it got better and red actually means it got worse. 01:12:51.000 --> 01:12:56.000 So I encourage you to read the Hirakawa and Aagaard paper, again, to learn more about this. 01:12:56.000 --> 01:13:03.000 Models are available on ScienceBase. We also released version 21.0, which is the same as 01:13:03.000 --> 01:13:09.000 the 2008 version but in this new GeoModelGrids storage format that was developed by Brad Aagaard. 01:13:09.000 --> 01:13:12.000 This is new software used for storing and querying 01:13:12.000 --> 01:13:21.000 these raster-based models, and it can handle topography, multiple geographic projections, and variable resolution. 01:13:21.000 --> 01:13:24.000 And it uses the widely used HDF5 storage scheme (see the sketch after this passage). 01:13:24.000 --> 01:13:31.000 You can take a screenshot of this slide to get these links. 01:13:31.000 --> 01:13:34.000 So now what I thought I'd do is just jump around the model and highlight 01:13:34.000 --> 01:13:38.000 some of the special study regions that we're focusing on in the sessions of this workshop. 01:13:38.000 --> 01:13:44.000 We heard about the Greenville fault in the Livermore Valley area, and in the velocity model 01:13:44.000 --> 01:14:04.000 the Livermore Basin is represented by a large, 20-kilometer-wide basin, around 7 kilometers deep at its deepest, but we found in our evaluation study that the velocities there were too high in the previous version. So here's a figure from the Hirakawa and Aagaard study: we 01:14:04.000 --> 01:14:10.000 found synthetic arrival times were coming in too early, shown by the blue curves, so we manually slowed down the velocities here, and it 01:14:10.000 --> 01:14:17.000 led to synthetics that arrive much closer to the observations. 01:14:17.000 --> 01:14:36.000 And there's still work to do here, including modifying the basin shape, modifying the geology of the Mount Diablo region, which has been more recently mapped in better detail, and also including, from geologic maps, very soft Quaternary sediments that are mapped in detail but 01:14:36.000 --> 01:14:41.000 aren't explicitly in the model at this time. 01:14:41.000 --> 01:14:43.000 We learned about the Santa Clara Valley in a previous session. 01:14:43.000 --> 01:15:00.000 This is somewhere where we have started to more carefully add mapped Quaternary sediments, west of the Hayward and Calaveras faults in what we call the Central Bay Block, where the shallow velocity structure is really oversimplified and pretty homogeneous in 01:15:00.000 --> 01:15:13.000 the current model. And this is despite the fact that we have pretty detailed geologic maps that separate things like Bay mud, which I mentioned earlier, from sand and alluvium, which might have much different properties. 01:15:13.000 --> 01:15:23.000 I don't have time to really go into detail about how we did this, but the result is higher velocities in the alluvium and harder sediments, while we maintain the low velocities for the Bay mud. One result of this is to remove large, 01:15:23.000 --> 01:15:28.000 unrealistic shaking from the synthetics generated with the current model, which are shown in blue.
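Since the models are distributed in an HDF5-based format, a minimal sketch of reading one directly is below. The file name, group path, dataset name, and attribute name are all assumptions for illustration; the real layout and the supported query tools are documented with the geomodelgrids software itself.

```python
import h5py
import numpy as np

# Hypothetical file and internal layout -- check the geomodelgrids
# documentation for the actual structure before relying on any of this.
with h5py.File("sfbay_velocity_v21.h5", "r") as f:
    block = f["blocks/block_0"]                  # assumed group path
    vs = np.asarray(block["Vs"])                 # assumed dataset name
    dx = float(block.attrs["resolution_horiz"])  # assumed attribute name
    print(vs.shape, dx)
```

In practice one would use the query interface shipped with the software rather than raw HDF5 reads, so that topography and projections are handled consistently.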
01:15:28.000 --> 01:15:35.000 The new synthetics, shown in red, look much more like the observations, which again are shown in black. 01:15:35.000 --> 01:15:41.000 So we plan to wrap this up very soon and hopefully release a new model update this year, version 23. 01:15:41.000 --> 01:15:47.000 And this is being prepared for a journal paper. 01:15:47.000 --> 01:15:50.000 Lastly, I think tomorrow we're going to hear about the Geysers. 01:15:50.000 --> 01:16:05.000 The Geysers is kind of far out, close to the edge of the velocity model domain, but it is in the realm of the North Bay, which I'm broadly grouping together, and I showed this in a previous slide where we updated this in 2021. The situation here that we 01:16:05.000 --> 01:16:11.000 found is that the current geologic model is really oversimplified: it has high-velocity 01:16:11.000 --> 01:16:21.000 Franciscan rocks spreading out way farther than they should, and it completely overlooks the Sonoma Volcanics, which we know are widespread in the North Bay. 01:16:21.000 --> 01:16:28.000 So in the 2021 update, we drew some kind of crude lines here just to bring some of the synthetics to look more like these observations, 01:16:28.000 --> 01:16:38.000 just cutting up these blocks without actually changing the geologic model, and this got rid of some of these long reverberation-type artifacts that weren't in the data. 01:16:38.000 --> 01:16:45.000 But it would be nice to explicitly include some of these mapped geologic features in the geologic model itself. 01:16:45.000 --> 01:16:48.000 So this kind of leads into how I want to finish here. 01:16:48.000 --> 01:16:50.000 What else can we do in the future of the model? 01:16:50.000 --> 01:16:56.000 We can refine the geologic model and get rid of some of these quick proxies, like I showed on the last slide. 01:16:56.000 --> 01:16:59.000 We also have 3D surfaces that exist that we should theoretically be able to just drop into the geologic model. 01:16:59.000 --> 01:17:12.000 Here's an Eel River Basin model developed by Rob Graves, based on some cross sections originally drawn in 1953, that we're using to model the Ferndale earthquake 01:17:12.000 --> 01:17:30.000 right now. We can add info from other regional inversions, like tomography and attenuation; even though these might not have those sharp boundaries, we might be able to find out more about some of the areas that are underrepresented. We can add info from active-source experiments: 01:17:30.000 --> 01:17:48.000 there are a number of refraction lines that go across the East Bay, done by Rufus Catchings, as Strayer et al. show. And then, of course, we have a ton of surface data, so we can reconcile the surface values with Vs30 or whatever other geotechnical info we have 01:17:48.000 --> 01:17:56.000 to add in the so-called geotechnical layer to the model, which we know has a big effect on ground motions. 01:17:56.000 --> 01:18:05.000 So, in summary, the USGS Bay Area 3D geology and seismic velocity models are the most accurate community models in northern California. 01:18:05.000 --> 01:18:15.000 The primary use is in simulations; the simulations are used ideally to predict ground motions, but we also use simulations to evaluate and update the model itself, for 01:18:15.000 --> 01:18:23.000 example with these simple point-source simulations. And there's a lot more data, published and in progress, that can help improve the model.
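On the geotechnical layer: a common way to impose one is to taper the model's shallow shear velocity toward a Vs30-derived surface value over the top few hundred meters. The sketch below is a simplified linear blend in the spirit of published geotechnical-layer schemes (e.g., Ely et al.); the blending function, taper depth, and surface target are assumptions, not the USGS plan.

```python
import numpy as np

def apply_gtl(vs_model: np.ndarray, z_m: np.ndarray, vs30: float,
              taper_depth_m: float = 350.0) -> np.ndarray:
    """Blend from a Vs30-based value at the surface to the unmodified
    model velocity at and below the taper depth (all Vs in m/s)."""
    w = np.clip(z_m / taper_depth_m, 0.0, 1.0)   # 0 at surface, 1 below taper
    return (1.0 - w) * vs30 + w * vs_model

# Toy profile: uniform 1200 m/s model velocity, soft site with Vs30 = 280 m/s.
z = np.linspace(0.0, 1000.0, 101)
vs = np.full_like(z, 1200.0)
vs_gtl = apply_gtl(vs, z, vs30=280.0)
```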
01:18:23.000 --> 01:18:36.000 So we're going to be excited to see how these can be incorporated in the future, and I'll just finish by saying that we have a workshop coming up in a few weeks, Tuesday, February 14th, and you can contact me if you would like to attend 01:18:36.000 --> 01:18:42.000 this. Thank you very much. 01:18:42.000 --> 01:18:55.000 Thank you. A great, comprehensive overview of that update on an already complex model, which is now even more detailed. 01:18:55.000 --> 01:19:01.000 So that concludes the presentations 01:19:01.000 --> 01:19:05.000 part of the session 01:19:05.000 --> 01:19:24.000 and will lead into our 30-minute discussion section. There were a lot of questions and discussions in the chat, which can now be continued in person or further in the chat. 01:19:24.000 --> 01:19:54.000 I have a couple of questions myself, but if anybody wants to raise a hand, there's a reactions button at the bottom; just click on it and raise your hand to ask a question to anybody, and most of the time we'll try to get to it. 01:19:58.000 --> 01:19:59.000 Perfect. 01:19:59.000 --> 01:20:10.000 I won't say anything for a little bit to encourage reactions. 01:20:10.000 --> 01:20:14.000 Looks like Eric has questions. 01:20:14.000 --> 01:20:22.000 Yes, the new 3D geology and velocity model is really impressive. 01:20:22.000 --> 01:20:47.000 I wonder if the geometry of the Hayward fault is still basically the same as in the previous model, so that you're only changing the velocities; you haven't changed the geometry of the Hayward fault at all? 01:20:47.000 --> 01:21:15.000 Was the question for me? Yeah. So in those updates we did not change any of the defined geologic surfaces or fault surfaces, so the geometry of that is the same. 01:21:15.000 --> 01:21:20.000 It looks like Alex had a question. 01:21:20.000 --> 01:21:26.000 Hey, yeah, this question is for, I guess, Evan and Russ both. Thanks, both of you, for nice talks. 01:21:26.000 --> 01:21:28.000 I have a question about, obviously, I'll ask a question about faults. 01:21:28.000 --> 01:21:41.000 Could you use either of your models to infer the presence of faults that are unknown at the surface, like buried faults, or inferred faults that are part of a structure? 01:21:41.000 --> 01:21:47.000 Could you comment on the utility of your models to cross-check the geologic 01:21:47.000 --> 01:21:52.000 understanding of three-dimensional faults that way? 01:21:52.000 --> 01:22:00.000 Well, I don't think we could use our model; a lot of our faults are downward-projected from surface faults. 01:22:00.000 --> 01:22:14.000 The only places... Excuse me, dealing with a bad head 01:22:14.000 --> 01:22:15.000 Yeah. 01:22:15.000 --> 01:22:35.000 cold here. The only places we have blind faults are places where those have been imaged, or... 01:22:35.000 --> 01:22:36.000 Cool. 01:22:36.000 --> 01:22:37.000 Do you need to take a break? Yeah, no, I can... 01:22:37.000 --> 01:22:44.000 Yeah, I'll pick up. I don't think with our 3D models there are really ways that we could discover new faults, if that's what you're saying, or infer 01:22:44.000 --> 01:22:52.000 locations of faults; they're usually the result of people, you know, going out and discovering the new faults. 01:22:52.000 --> 01:23:06.000 There are places where maybe a contact, which is actually a fault contact in the real earth, is kind of just represented as a depositional contact. 01:23:06.000 --> 01:23:11.000 So, actually,
Rodgers brought this up yesterday when we were looking at the Los Angeles fault. 01:23:11.000 --> 01:23:22.000 There's not really an explicit fault surface there in the model, but there's kind of a part where the bedrock dips down and looks like a fault, so there are areas like that that we can kind of aim at 01:23:22.000 --> 01:23:29.000 and say, okay, maybe that's a target, but we should 01:23:29.000 --> 01:23:38.000 use what's there to build on. I think that's the best direction that would address what you said. 01:23:38.000 --> 01:23:43.000 Thank you. 01:23:43.000 --> 01:23:47.000 I see Laura has her hand up. 01:23:47.000 --> 01:23:48.000 Yes, and it's very much related, on the same kind of topic as well, 01:23:48.000 --> 01:23:57.000 as was mentioned. So, Evan, in particular: in your model there you have these places where you had to change the velocity profile, 01:23:57.000 --> 01:24:02.000 but you don't change the geometry. 01:24:02.000 --> 01:24:06.000 That is a little bit weird, because shouldn't that actually be a motivation to see 01:24:06.000 --> 01:24:10.000 that the geology is different from what we're thinking there, and therefore use that to 01:24:10.000 --> 01:24:17.000 refine the geologic framework, in a way? What is the hang-up about actually just changing that structural model that you rely on? 01:24:17.000 --> 01:24:20.000 Yeah, so, I mean, the line of re... yeah, you're right. 01:24:20.000 --> 01:24:30.000 So the line of reasoning is that a lot of the surfaces, and the depths of the surfaces, and the faults were constrained by things 01:24:30.000 --> 01:24:42.000 other than the method we were using: you know, these double-difference relocations, or in some cases geologic maps, or, you know, the potential-field stuff. 01:24:42.000 --> 01:24:47.000 So I think that it is a little bit dangerous to just change the velocities and say, oh, we made the waveforms look better, 01:24:47.000 --> 01:24:51.000 so that means that this is better. Yeah, you're right: 01:24:51.000 --> 01:25:03.000 you should go back and change and re-examine the actual geologic surfaces. 01:25:03.000 --> 01:25:05.000 So that's something that, yeah, will be kind of a future direction. 01:25:05.000 --> 01:25:14.000 But I think at this resolution we found that what we did was somewhat reasonable. 01:25:14.000 --> 01:25:15.000 Hmm. 01:25:15.000 --> 01:25:21.000 That leads to sort of the 01:25:21.000 --> 01:25:40.000 question maybe on many people's minds: is there a plan to, you know, combine or use the updated Coast Ranges model from Russ to inform an update of the Bay Area model 01:25:40.000 --> 01:25:46.000 in many of these ways, with new faults and stratigraphy? 01:25:46.000 --> 01:25:47.000 Okay. 01:25:47.000 --> 01:25:54.000 So, right. Yeah, Russ, does the Coast Range model overlap with the Bay Area model, or... 01:25:54.000 --> 01:25:55.000 Okay. 01:25:55.000 --> 01:26:09.000 Yeah, it abuts it at the south end. Sorry, sorry about the coughing fit. The challenge would be that we've taken a slightly different approach to the subdivision of the above-basement stratigraphy, 01:26:09.000 --> 01:26:14.000 so we would have to figure out how to make those meld together. 01:26:14.000 --> 01:26:25.000 We have some leverage there, in that there's a lot of basement at the surface, so where there's basement at the surface we don't have to worry about the above-basement section because
01:26:25.000 --> 01:26:35.000 there is none. Once we have the velocity-depth relations developed for the Coast Ranges and have that 01:26:35.000 --> 01:26:44.000 integrated into the 3D geologic model, I think it would be relatively straightforward to compare along the edge 01:26:44.000 --> 01:26:50.000 and see how it compares. 01:26:50.000 --> 01:26:54.000 Once we have those two models stuck together, 01:26:54.000 --> 01:27:09.000 then we have a relatively continuous model from the Transverse Ranges up to Point Arena, I think. 01:27:09.000 --> 01:27:14.000 So it would be a big volume to work with. 01:27:14.000 --> 01:27:15.000 I'm good. 01:27:15.000 --> 01:27:17.000 So there was... 01:27:17.000 --> 01:27:34.000 There was some discussion in the chat also about the possibility of including fault zone structures in these models as an added layer of complexity, and maybe there's more to say about that. 01:27:34.000 --> 01:27:37.000 Well, it depends on what we mean by fault zone structures. 01:27:37.000 --> 01:27:54.000 The Central Coast Range model has tried to capture those places where there are flower structures that have brought basement rock up as sort of downward-pointing triangular wedges 01:27:54.000 --> 01:28:04.000 along a fault. We've tried to add a fault on either side, rather than having a single fault down the middle, so that we could have that lens of 01:28:04.000 --> 01:28:30.000 high-velocity rock in the model. But if we're talking about things like zones of low velocity due to rock crushing along the fault, that would be a scale probably too small for a regional model, and 01:28:30.000 --> 01:28:42.000 I mean, it would be possible to do, but someone would want to go in and start with our model and do more detailed modeling on top of it, 01:28:42.000 --> 01:28:53.000 rather than try to integrate that concept across a model that's hundreds of miles long. 01:28:53.000 --> 01:29:04.000 I'd say the same thing for the Bay Area model, because, you know, a lot of the fault planes were kind of defined in detail by the double-difference earthquake relocations, 01:29:04.000 --> 01:29:12.000 but the actual spread of them might be somewhere as wide as the fault zone, so we don't know. 01:29:12.000 --> 01:29:33.000 To add it into the geologic model would be pretty daring, I think, you know, but on a case-by-case basis, for a simulation or something, if you wanted to add in a low-velocity fault zone, you could kind of just do that manually. But I don't think we're at the 01:29:33.000 --> 01:29:40.000 point where we can with confidence add that into the community model that everyone's using. 01:29:40.000 --> 01:29:46.000 We want to make sure to also, you know, take questions for some of the earlier speakers. 01:29:46.000 --> 01:29:48.000 And so, if there are some other questions out there for them, please raise your hand. 01:29:48.000 --> 01:30:01.000 There might be a few in the chat as well. 01:30:01.000 --> 01:30:23.000 So, one of the things that came up in the chat during Greg's talk had to do with whether these methods that he was describing would be helpful for finding more repeating earthquakes, and also how the catalogs coming from the different 01:30:23.000 --> 01:30:29.000 methods might be validated against each other, and maybe you could say more about those topics. 01:30:29.000 --> 01:30:31.000 Yeah, you want...
You want me to speak to those? 01:30:31.000 --> 01:30:35.000 Sure, I think they were coming up in relation to your talk. 01:30:35.000 --> 01:30:47.000 Yeah, they did. And so the notion of doing a comprehensive search for repeating earthquakes using machine learning is an interesting idea. 01:30:47.000 --> 01:30:51.000 I have a student who's sort of doing that in his spare time. 01:30:51.000 --> 01:30:55.000 He has an idea for how to do that, and we will see if it works or not. 01:30:55.000 --> 01:31:03.000 But we have, you know, the great catalog already for Central and Northern California to compare that with. 01:31:03.000 --> 01:31:12.000 So that should go pretty quickly. And the second part of the question... these two-part questions always throw me. 01:31:12.000 --> 01:31:14.000 What was the second part, Jess? 01:31:14.000 --> 01:31:20.000 About validation of the catalogs that come from different methods. 01:31:20.000 --> 01:31:21.000 Oh, can you hear me? Yeah. Yeah. Validation. 01:31:21.000 --> 01:31:29.000 Hello! Oh, validate. Yes. Okay. So I can hear you; I'm in a hotel on the Internet, so... 01:31:29.000 --> 01:31:34.000 Okay. So, one of the things about these deep catalogs: if you look at, say, Ridgecrest, 01:31:34.000 --> 01:31:46.000 there are three or four or five deep catalogs for the Ridgecrest sequence developed by different techniques. 01:31:46.000 --> 01:31:50.000 And if you look at the earthquakes they have in common and the earthquakes they don't, 01:31:50.000 --> 01:31:53.000 there's a big population that they don't have in common. 01:31:53.000 --> 01:31:57.000 And so that's something we need to understand, at least for some uses, like using them for ETAS-like earthquake forecasts. 01:31:57.000 --> 01:32:12.000 So we sort of have a hypothesis that these different machine learning and data science methods have different sensitivities. 01:32:12.000 --> 01:32:28.000 We know they have inconsistencies in how well they pick; even the same picker at different times picks slightly different signals, and when you're working down to the detection threshold, you know, there's a big population of earthquakes that are marginally detectable. So we think that's what's 01:32:28.000 --> 01:32:38.000 going on, but we're not sure. So there's some interesting work, I think, to be done in that space, and that's, you know, 01:32:38.000 --> 01:32:46.000 of course, related to validating that these are legitimate events in the catalog. 01:32:46.000 --> 01:32:48.000 Okay. Great. Thanks. 01:32:48.000 --> 01:32:49.000 Yeah. 01:32:49.000 --> 01:32:54.000 Thank you for the question. Well, I see Alex has her hand up. 01:32:54.000 --> 01:32:55.000 Yeah, I just posted a question in the chat, and 01:32:55.000 --> 01:33:01.000 Mr. Grendel has asked me to turn on my camera. So this is for Austin, 01:33:01.000 --> 01:33:17.000 Josie, and Roland altogether. Austin and Josie presented some great USGS resources that compiled all of these creep observations from the field, and Roland was talking a lot about other methods to compile creep observations. 01:33:17.000 --> 01:33:19.000 Is there, like, some database of InSAR observations? 01:33:19.000 --> 01:33:26.000 I know there's the Vertex tool that makes it so much more accessible to get an interferogram, 01:33:26.000 --> 01:33:42.000 but in terms of on-fault measurements through time,
is there some sort of, I don't know if it's a collaborative database effort or something, to put it all together? 01:33:42.000 --> 01:33:46.000 So there isn't exactly... what do you have in mind? 01:33:46.000 --> 01:33:54.000 UNAVCO, now EarthScope, has a database of processed data products, 01:33:54.000 --> 01:33:58.000 and that could include creep time series. 01:33:58.000 --> 01:33:59.000 But we don't have anything comprehensive. 01:33:59.000 --> 01:34:11.000 The data that we produce, partly, you know, funded by the USGS external program, 01:34:11.000 --> 01:34:15.000 that is in a database, you know, via Zenodo, that's fully documented, 01:34:15.000 --> 01:34:23.000 with the raw data and the creep data publicly available. 01:34:23.000 --> 01:34:30.000 But I think it's a great idea to have that kind of system that's more comprehensive. 01:34:30.000 --> 01:34:37.000 That's a really good idea. 01:34:37.000 --> 01:34:48.000 I know Kaj Johnson went through the effort of compiling a lot of creep observations for the NSHM update that's been ongoing, and they did an excellent job of compiling it. 01:34:48.000 --> 01:34:49.000 But it would be in the interest of keeping all of these models as living, breathing models; 01:34:49.000 --> 01:34:57.000 it's an interesting idea to keep it rolling through time. 01:34:57.000 --> 01:34:59.000 That's a great idea. 01:34:59.000 --> 01:35:03.000 So I noticed, Roland, that you posted yourself a question in the chat. 01:35:03.000 --> 01:35:05.000 Would you like to address your question? 01:35:05.000 --> 01:35:10.000 Well, you know, these wholesale creep rate changes 01:35:10.000 --> 01:35:18.000 only became clear to us two days before the deadline for submitting the video. 01:35:18.000 --> 01:35:23.000 So that made me feel like, oh, you know, if I could have shown everything today, that would be great. 01:35:23.000 --> 01:35:37.000 But anyways, am I able to share my screen? I want to show just really quickly a figure, if that's possible. 01:35:37.000 --> 01:35:52.000 So, right, I showed you this comparison of creep rates from repeating earthquakes that Taka produced on the Hayward fault, northern and southern, and then the northern and central Calaveras, and so on. 01:35:52.000 --> 01:36:05.000 So this plot here on the left essentially takes all the data between the arrows 01:36:05.000 --> 01:36:11.000 here on the Hayward fault, the southern Hayward fault down here, and for the Calaveras 01:36:11.000 --> 01:36:12.000 the same thing. And so here, now, you nicely see how, on the southern Calaveras fault, 01:36:12.000 --> 01:36:32.000 the repeating earthquake data show that deep deceleration all the way since the 1984 Morgan Hill earthquake; on the northern part of the central Calaveras, the Alum Rock segment, you can nicely see a decaying trend from 19 01:36:32.000 --> 01:36:40.000 84, 1988, and the 2007 Alum Rock earthquake; and then on the Hayward fault 01:36:40.000 --> 01:37:00.000 you see the acceleration on the southern Hayward fault and the northern Hayward fault very nicely, and then bumps when we had magnitude 4 earthquakes in 2007, 2011, and 2018, which you see on the right. So that gives sort 01:37:00.000 --> 01:37:11.000 of a much better picture of these temporal variations over the course of almost 40 years, which seem very significant.
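For context on how repeating earthquakes yield creep rates like these, here is a minimal sketch: each repeat is converted to a slip increment with an empirical moment-slip scaling, and slip is accumulated over time. The constants shown are the widely cited Nadeau and Johnson (1998) values (slip d in cm, moment M0 in dyne-cm), and the event sequence is invented; neither should be read as the actual data or calibration behind the figures discussed here.

```python
import numpy as np

def slip_per_event_cm(m0_dyne_cm: float) -> float:
    # Empirical repeater scaling d = 10**(-2.36) * M0**0.17 (Nadeau & Johnson,
    # 1998); treat the constants as illustrative.
    return 10.0 ** (-2.36) * m0_dyne_cm ** 0.17

# Hypothetical repeating-earthquake sequence: occurrence times (decimal years)
# and seismic moments for ~M2 events.
times = np.array([1990.2, 1993.1, 1996.0, 1998.8, 2001.9])
m0 = np.full(times.size, 1.1e19)

slips = np.array([slip_per_event_cm(m) for m in m0])
rate_cm_per_yr = slips[1:].sum() / (times[-1] - times[0])  # avg creep rate
```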
01:37:11.000 --> 01:37:19.000 And so then the question is, right: is this all related to the effect of these and other earthquakes 01:37:19.000 --> 01:37:25.000 and the strain rates that they perturb, or is there something else going on? 01:37:25.000 --> 01:37:33.000 So I was excited by this and thought I would share that with you. 01:37:33.000 --> 01:37:53.000 I would just add, in the context of bringing these sort of near-field surface observations and the geodesy and everything all together, there is a pretty strong complementarity naturally among the different scales of measurement that may help us understand 01:37:53.000 --> 01:38:10.000 the processes. I think Josie alluded to this very well: you know, when you're measuring the creep over 30 meters, it's different from over 100 or 200 meters, and it's different from InSAR pixels at a regional scale, and there may be very different processes 01:38:10.000 --> 01:38:31.000 going on that modulate the rates at different scales and different places along the fault. So it's good to have all the data sets. 01:38:31.000 --> 01:38:40.000 A good question for Josie: I thought I heard you mention that the number of creepmeters is declining over the years. 01:38:40.000 --> 01:38:42.000 Is that right? 01:38:42.000 --> 01:38:48.000 It has over the last 30 years or so, and I don't know the reasons for all of them. 01:38:48.000 --> 01:38:54.000 Some of them, you know, were never telemetered; 01:38:54.000 --> 01:39:02.000 the data was always manually collected, and so those sites in particular are not maintained in the record anymore. 01:39:02.000 --> 01:39:15.000 The real expert in this is John Langbein, who has done decades of work on the creepmeters, so if he wants to chime in on that, he's welcome to. 01:39:15.000 --> 01:39:16.000 Yeah. 01:39:16.000 --> 01:39:21.000 Yeah, I guess I'll chime in. Basically, it's personnel. Back in the 1980s, 01:39:21.000 --> 01:39:34.000 I don't know how many people were on the creep project, probably about three or four, and then eventually I ended up being the master of ceremonies, arguably, probably in the late nineties and early 2000s. 01:39:34.000 --> 01:39:46.000 There were essentially two of us; at times it was just one of us, so it was basically a skeleton program. 01:39:46.000 --> 01:39:50.000 You know, there's just not enough people to go around to maintain these things. 01:39:50.000 --> 01:39:53.000 And arguably we're at this point now: 01:39:53.000 --> 01:39:54.000 Eric is normally the one who's supposed to maintain the USGS 01:39:54.000 --> 01:40:03.000 stuff, and he's torn among a whole bunch of different projects. 01:40:03.000 --> 01:40:07.000 So that's where it is. 01:40:07.000 --> 01:40:10.000 Now let's see, there was... Oh, go ahead, John! 01:40:10.000 --> 01:40:18.000 Yeah, the strainmeter programs are having the same problem. 01:40:18.000 --> 01:40:35.000 There was some mention in the chat, I believe, about the idea of measuring creep below San Pablo Bay, and I don't know if you have any commentary on what might be possible, or in the works, there. 01:40:35.000 --> 01:40:41.000 Well, there are ideas in the works, and the first field visit is happening within the next couple of weeks. 01:40:41.000 --> 01:40:56.000 So it's kind of early days. But, you know, Roger and John and others have designed creepmeters to be, you know, able to be flooded in their vaults.
01:40:56.000 --> 01:41:10.000 So the water isn't necessarily a prohibiting factor, and if Janet wants to chime in, if she's still around, that would be great. But yeah, no, it's exciting. 01:41:10.000 --> 01:41:16.000 It'll be really interesting to have a sensor there to see, you know, what's going on in the bay, 01:41:16.000 --> 01:41:19.000 where the faults continue. 01:41:19.000 --> 01:41:22.000 You can give us an update next year. 01:41:22.000 --> 01:41:28.000 Yeah. 01:41:28.000 --> 01:41:32.000 Yeah, it looks like we have maybe just a few more minutes left, 01:41:32.000 --> 01:41:43.000 if there are any remaining questions out there. 01:41:43.000 --> 01:41:49.000 Is there, like, a systematic plan to have additional faults with alignment 01:41:49.000 --> 01:41:56.000 arrays or creepmeters? It seems like, you know, it's hard enough to maintain what there is already. 01:41:56.000 --> 01:41:59.000 Yeah, so I'll speak to the alignment arrays. 01:41:59.000 --> 01:42:09.000 In the last couple of decades, I think I mentioned this in the talk, there's been a pretty massive expansion of the alignment arrays to sort of beyond the Hayward and San Andreas 01:42:09.000 --> 01:42:19.000 faults, and so there are a lot more arrays spread around the region, but that has to some degree outpaced the capacity of the people involved in the project to keep up with it. 01:42:19.000 --> 01:42:27.000 And so you see, when you look at the data, that a lot of them have gone from sub-annual sampling to, well, just annual sampling. 01:42:27.000 --> 01:42:37.000 And, you know, you lose a lot of resolution of discrete events that are going on, which you can detect in the earlier years of some of the records. 01:42:37.000 --> 01:42:39.000 And, oh yeah, Jim retired from the program, 01:42:39.000 --> 01:43:00.000 so we're trying to breathe some new life into the personnel that are available to go and do these. I know we've faced some of these challenges with just trying to get enough people to go out and visit the sites 01:43:00.000 --> 01:43:12.000 in recent years. And so, yeah, if people are interested in joining the party on the project... 01:43:12.000 --> 01:43:22.000 So, yeah. Right now I see Ben has his hand up, and he can probably speak to this as well. 01:43:22.000 --> 01:43:25.000 So, a final question by Ben. 01:43:25.000 --> 01:43:30.000 Hi, everybody. Yeah, just a comment on the San Pablo Bay underwater creepmeter. 01:43:30.000 --> 01:43:33.000 We've been thinking about that for a few years. Todd 01:43:33.000 --> 01:43:37.000 Ericson is an ocean engineer by trade, and we've been going through the process of designing some instruments and talking with Roger about it. 01:43:37.000 --> 01:43:50.000 I think Keith asked in the chat about whether the location of the fault is well known enough in the bay, and Janet Watt's work, 01:43:50.000 --> 01:44:07.000 which was a Science Advances article, with shallow seismic showed really clearly where the fault was heading north from Point Pinole up towards the northern portion of San Pablo Bay, where it seems to take a sort of right step and becomes less clear. So 01:44:07.000 --> 01:44:11.000 in sort of the southernmost two-thirds of the bay 01:44:11.000 --> 01:44:13.000 it seems pretty clear where it is, and we'd like to install more than one creepmeter there to see if we could,
01:44:13.000 --> 01:44:28.000 you know, image shallow slip propagating through. One of the advantages is that the water is very shallow there; at low tide it gets to be, you know, a few meters. 01:44:28.000 --> 01:44:29.000 That's also a disadvantage, though, just in terms of potential traffic and things hitting your devices. 01:44:29.000 --> 01:44:42.000 So, yeah. 01:44:42.000 --> 01:44:43.000 Great. Well, I think we're at the end of our time 01:44:43.000 --> 01:44:52.000 block here. So thanks, everyone, for really interesting talks and discussion. 01:44:52.000 --> 01:45:05.000 I don't know if there are any instructions that Sarah or John want to give about the breakout session that's coming up. 01:45:05.000 --> 01:45:10.000 Well, as always, thank you to all the amazing speakers and moderators. 01:45:10.000 --> 01:45:18.000 That was wonderful. That concludes the official workshop agenda for today. 01:45:18.000 --> 01:45:23.000 So you are all free to go, but if you would like to not go, 01:45:23.000 --> 01:45:28.000 you can stay with us, and we'll have a special breakout discussion on 01:45:28.000 --> 01:45:36.000 SCEC statewide expansion plans. We will have a few short little talks, and then we'll talk about how we are 01:45:36.000 --> 01:45:40.000 going to coordinate and move forward and make sure it's the bestest expansion ever. 01:45:40.000 --> 01:45:59.000 So I say, take 3 minutes to go grab a cup of coffee, stretch your legs, do a little dance in your own home, and let's start the quote... unquote breakout at 4:18 p.m., and you don't have to go anywhere.