WEBVTT 00:00:01.000 --> 00:00:08.000 Okay, thank you, John. So, a very short talk on the Ferndale earthquake, 00:00:08.000 --> 00:00:14.000 magnitude 6.4 on December 20: strong motion data. 00:00:14.000 --> 00:00:23.000 Okay, so this is basically a map with some general information about the strong motion records. 00:00:23.000 --> 00:00:34.000 At this time we have 163 station records in the Center for Engineering Strong Motion Data. 00:00:34.000 --> 00:01:00.000 If I zoom in a little bit more on the area closer to the epicenter, something that is very clear on this map is that stations to the east and northeast recorded stronger shaking. The map also shows the finite fault model from the U.S. Geological 00:01:00.000 --> 00:01:01.000 Survey. So I think there are several items to 00:01:01.000 --> 00:01:20.000 consider for why we have higher accelerations in the east and northeast, but I won't get into that; these are good items for a lot of discussion and research. 00:01:20.000 --> 00:01:34.000 But as you see on this map, the Rio Dell and Fortuna areas recorded the maximum shaking, and this map shows the peak ground acceleration versus distance. 00:01:34.000 --> 00:01:40.000 Now, the observations compared with a ground motion prediction equation. 00:01:40.000 --> 00:01:45.000 This is the Boore-Atkinson equation. 00:01:45.000 --> 00:01:56.000 That's from some time ago; the new comparisons will be available at the engineering data center soon. 00:01:56.000 --> 00:02:04.000 But one thing I wanted to mention is that in general there was a good match between the accelerations and the predictions 00:02:04.000 --> 00:02:26.000 after we consider the finite fault model. But then, when we look at the velocity, we see that the observations within 100 kilometers exceed the predictions. Well, I had a short discussion with Norm Abrahamson last week, and he reminded me that perhaps it's time 00:02:26.000 --> 00:02:31.000 to start using local ground motion prediction equations.
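The observed-versus-predicted comparison described here can be sketched in a few lines. The functional form below, ln(PGA) = c0 + c1·M + c2·ln(R_eff), and its coefficients are hypothetical placeholders for illustration only, not the actual Boore-Atkinson model coefficients:

```python
import numpy as np

def gmpe_ln_pga(mag, r_km, coeffs=(-4.0, 0.9, -1.2), h_km=6.0):
    """Illustrative GMPE functional form: ln(PGA) = c0 + c1*M + c2*ln(R_eff).
    The coefficients and depth term are hypothetical placeholders,
    NOT the Boore-Atkinson values used in the talk."""
    c0, c1, c2 = coeffs
    r_eff = np.sqrt(r_km ** 2 + h_km ** 2)  # distance with a depth saturation term
    return c0 + c1 * mag + c2 * np.log(r_eff)

# Compare hypothetical observed PGAs (in g) against the curve for M 6.4.
observations = {"near_station": (10.0, 0.6), "far_station": (50.0, 0.12)}
for name, (r_km, pga_g) in observations.items():
    residual = np.log(pga_g) - gmpe_ln_pga(6.4, r_km)
    print(f"{name}: ln-residual = {residual:+.2f}")
```

Positive residuals would indicate observations exceeding the prediction, as the talk reports for velocities within 100 km.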
00:02:31.000 --> 00:02:43.000 Because, anyway, the equations that we have used so far are general, and it's time to get to some local predictions, 00:02:43.000 --> 00:02:46.000 perhaps from now on. 00:02:46.000 --> 00:02:54.000 So, moving to the next slide. Okay, I'm not sure why it doesn't change. 00:02:54.000 --> 00:02:59.000 Okay. 00:02:59.000 --> 00:03:14.000 Yeah, okay, so let's see. Yeah. So this map: we were talking about the maximum acceleration in Rio Dell. There are two bridges in Rio Dell on the highway, 00:03:14.000 --> 00:03:22.000 one at the Eel River bridge, and the other is the Painter Street overpass. 00:03:22.000 --> 00:03:32.000 Both of them recorded very high accelerations, and this is the sensor layout of the Eel River bridge. 00:03:32.000 --> 00:03:40.000 The maximum acceleration was recorded in this area that I'm pointing to. 00:03:40.000 --> 00:03:48.000 The layout also shows the free-field station and the channels that we have on the abutment. 00:03:48.000 --> 00:03:53.000 And then the record of Pier 3: here the maximum was recorded, at 00:03:53.000 --> 00:04:04.000 this location, at Channel 7 that I'm pointing to. 00:04:04.000 --> 00:04:07.000 And you see 1.21 g maximum acceleration. 00:04:07.000 --> 00:04:22.000 But it is clear here that there are some spikes on the record, and this could be due to the impact, I mean the structural response. 00:04:22.000 --> 00:04:28.000 When we look at the other bridge, this is the Painter Street bridge; the bridge is instrumented, and we have a free-field station as well, 00:04:28.000 --> 00:04:38.000 close to the bridge here. And when we look at the record, 00:04:38.000 --> 00:04:46.000 there are spikes observed on some of the channels of this bridge on the abutment, 00:04:46.000 --> 00:04:53.000 as well as on the record of the free field. 00:04:53.000 --> 00:05:01.000 This is the free field, 1.44 g, and it also shows some high frequencies.
00:05:01.000 --> 00:05:14.000 Oh, it could be that there is some impact of the structure on the ground that has caused such high frequencies. 00:05:14.000 --> 00:05:22.000 However, it was interesting that when you go to some other areas, for example, this map 00:05:22.000 --> 00:05:33.000 shows the station in the Fortuna area that is not close to any structure. 00:05:33.000 --> 00:05:45.000 We also see some spikes on this record, so would that be anything related to the characteristics of the ground motion, 00:05:45.000 --> 00:05:57.000 not necessarily because of the impact of a structure? I think these are very good questions, and I'm sure that engineers 00:05:57.000 --> 00:06:07.000 and seismologists will look at these records. Basically, that is a kind of general 00:06:07.000 --> 00:06:34.000 summary of the records of the mainshock. We had a number of aftershocks also with strong motion data in the engineering data center, and the largest, the magnitude 5.3 on January 1st, also has a set of very interesting records 00:06:34.000 --> 00:06:38.000 that would be really good to look at and learn from. 00:06:38.000 --> 00:06:41.000 I think there are some good lessons to learn from this. 00:06:41.000 --> 00:06:50.000 That's great, that's all I had. 00:06:50.000 --> 00:06:59.000 Thanks, Hamid. That was our Ferndale talk; he wasn't able to join us on Tuesday, so we've added him in here to this session. 00:06:59.000 --> 00:07:10.000 So for the rest of our session, which is sort of our appetizer before lunch, just like last time, we'll have a series of talks on earthquake forecasting and aftershocks. 00:07:10.000 --> 00:07:11.000 Then we'll have a pause for questions before diving into our second block of talks, which will focus on 00:07:11.000 --> 00:07:24.000 earthquake response.
So we'll go to David Jackson for the next talk. 00:07:24.000 --> 00:07:34.000 We know that the distinction between science and other disciplines is that scientific ideas can be tested against independent information. 00:07:34.000 --> 00:07:42.000 One way to express our ideas is to forecast the occurrence and characteristics of future earthquakes. 00:07:42.000 --> 00:07:50.000 A major goal of this current workshop is to plan one or more detailed forecasts. 00:07:50.000 --> 00:07:55.000 There are several possible objectives to such forecasts. 00:07:55.000 --> 00:08:00.000 Let's get this going. There we go! 00:08:00.000 --> 00:08:09.000 Clearly one is to provide useful quantitative estimates of probable source characteristics for hazard estimation. 00:08:09.000 --> 00:08:28.000 We may also use forecasts to test modeling of earthquake occurrence, based on fault geometry, fault activity, accumulated stress and strain, past earthquakes on and off faults, and other observations. And of course we must publish papers and reports that explain earthquake 00:08:28.000 --> 00:08:40.000 hazard to various stakeholders. These and other objectives are not identical, and it is worthwhile to weigh them in advance and consider how we can satisfy them. 00:08:40.000 --> 00:08:41.000 The map on the right illustrates the 2013 00:08:41.000 --> 00:08:52.000 UCERF3 rupture forecast. It incorporates current scientific models and their assumptions. 00:08:52.000 --> 00:09:02.000 It is designed for use in the U.S. national earthquake hazard map, which is in turn incorporated in building codes and other policy documents. 00:09:02.000 --> 00:09:06.000 Thus the UCERF3 format and its results were conditioned 00:09:06.000 --> 00:09:12.000 on avoiding rapid changes that might conflict with previous policies.
00:09:12.000 --> 00:09:24.000 These are the kinds of objectives that are best articulated in advance, planning for forecasting and probabilistic testing. 00:09:24.000 --> 00:09:25.000 A landmark forecast was the 1988 00:09:25.000 --> 00:09:32.000 estimate of probabilities of large earthquakes in California. 00:09:32.000 --> 00:09:42.000 Its objectives were to publicize the magnitude of the overall risk in California, and to announce the recent ability to estimate probabilities of ruptures 00:09:42.000 --> 00:09:51.000 on 16 segments of major faults. No test was planned, nor executed, by the authors of the forecast. 00:09:51.000 --> 00:10:06.000 After 35 years only the small earthquake at Parkfield in 2004 fit the forecast, while the 1989 Loma Prieta earthquake was ambiguous. Thus the forecast satisfied 00:10:06.000 --> 00:10:15.000 only the first objective, publicity, and it illustrated the need for a test plan integrated into the forecast itself. 00:10:15.000 --> 00:10:19.000 The figure on the right shows a side-by-side comparison of 16 different forecasts produced in 2007 specifically for a test, 00:10:19.000 --> 00:10:29.000 a five-year one, called the Regional Earthquake Likelihood Models test. 00:10:29.000 --> 00:10:35.000 All these forecasts include fault-based, smoothed-seismicity, and b-value models. 00:10:35.000 --> 00:10:49.000 There was a clear winner, and some forecasts struggled. Repeating the forecast and test over the successive 5-year interval showed very consistent results. 00:10:49.000 --> 00:10:57.000 So, including a prospective test strategy in the upcoming forecast is definitely advantageous. 00:10:57.000 --> 00:10:58.000 If you Google earthquake predictability, you'll be led to experts in this process, 00:10:58.000 --> 00:11:24.000 and some are listed here. Thank you. 00:11:24.000 --> 00:11:47.000 We're talking today about 3 precursors that we observed before the 2004 Parkfield mainshock.
00:11:47.000 --> 00:11:54.000 There are 63 master templates that we use; in blue are 843 detected events, which is 13 times as many; and in magenta 00:11:54.000 --> 00:11:57.000 there are 385 relocated events. 00:11:57.000 --> 00:12:01.000 The red line shows the steady background rate for each catalog. 00:12:01.000 --> 00:12:12.000 We observe an acceleration in the frequency of events in both the detected and relocated catalogs one month, two weeks, and days before the Parkfield mainshock. 00:12:12.000 --> 00:12:26.000 The locations of these in map view and fault-perpendicular and along-fault cross sections: again, green are the master events, magenta the detected and relocated events, and the red star is the Parkfield mainshock hypocenter. 00:12:26.000 --> 00:12:27.000 Within one to two kilometers of the hypocenter we observe 8 new foreshocks 00:12:27.000 --> 00:12:36.000 that weren't in the original catalog, that were detected by the cross-correlation. 00:12:36.000 --> 00:12:44.000 We look at repeating foreshocks only, a catalog of 83 events; again, cumulative number of events versus days leading up to the mainshock. 00:12:44.000 --> 00:13:00.000 Red is the steady background rate; you see a more clear acceleration in the frequency of earthquakes in the months to weeks and days before the earthquake. Repeating earthquakes measure slip, like creepmeters at depth, and repeating foreshocks measure this precisely. We look at individual clusters of 00:13:00.000 --> 00:13:06.000 the repeating foreshocks. We have 8 individual foreshock clusters with 3 or more events. 00:13:06.000 --> 00:13:13.000 These show a steady rate in 3 of the clusters, and an increase in the frequency of events in 5 of the clusters. The locations of the repeating 00:13:13.000 --> 00:13:28.000 foreshock catalog are blue for doublet events and other colors for clusters with 3 or more events; they clearly occurred a significant distance away from the hypocenter. 00:13:28.000 --> 00:13:44.000 If we look at the repeating foreshock clusters in a scatter plot of distances of events to the Parkfield hypocenter versus days leading up to the mainshock, you see a red line fit with a negative trend and negative correlation, showing decreasing distance with time leading up to the 00:13:44.000 --> 00:13:56.000 mainshock. It could be a migration of hypocenters in time, or an increase in velocity between the repeating foreshocks and the Parkfield hypocenter before the mainshock. 00:13:56.000 --> 00:14:13.000 We can look at these clusters and represent the data in another way, where colder blue colors primarily occur farther in time and space from the mainshock, to the northwest, and warmer red colors primarily occur closer in time and space to the mainshock, to the southeast, 00:14:13.000 --> 00:14:19.000 where the hypocenter occurs. So at Parkfield in 2004, tectonic tremor was observed beforehand; that is a previous precursor, and we observe 3 new precursors before the 2004 earthquake. First, new 00:14:19.000 --> 00:14:27.000 foreshocks observed close to the mainshock hypocenter, within one to two kilometers, where there were no events before. Second, 00:14:27.000 --> 00:14:36.000 acceleration in the frequency of earthquakes 00:14:36.000 --> 00:14:43.000 in the months, weeks, and days before the mainshock, in both the detected catalog and the relocated catalog.
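The negative trend and negative correlation described for the distance-versus-time scatter plot can be reproduced with a simple least-squares fit. The foreshock times and distances below are made-up illustrative values, not the catalog from the talk:

```python
import numpy as np

# Hypothetical repeating-foreshock data (illustrative, not the real catalog):
# time in days before the mainshock (increasing toward the origin time)
# and distance from each event to the hypocenter in km.
days = np.array([-30.0, -24.0, -18.0, -12.0, -6.0, -2.0])
dist_km = np.array([4.1, 3.8, 3.5, 2.9, 2.4, 1.9])

slope, intercept = np.polyfit(days, dist_km, 1)  # least-squares linear trend
r = np.corrcoef(days, dist_km)[0, 1]             # Pearson correlation
print(f"slope = {slope:.3f} km/day, correlation r = {r:.2f}")
```

A negative slope and strongly negative r are what would support either hypocenter migration toward the mainshock or a velocity increase along the path.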
00:14:43.000 --> 00:14:59.000 The repeating foreshock catalog measures this precisely: the individual repeating foreshock clusters show that 5 of the clusters near the Parkfield hypocenter also show an acceleration, which again is shown in the figure. And then thirdly, the third precursor 00:14:59.000 --> 00:15:00.000 is the migration of the hypocenters closer to the mainshock 00:15:00.000 --> 00:15:07.000 with time, moving up to the mainshock, which could be explained by a possible velocity increase 00:15:07.000 --> 00:15:37.000 if it's not migration. Thank you. 00:15:46.000 --> 00:15:52.000 Is someone gonna give John's talk? 00:15:52.000 --> 00:15:54.000 Maybe we should skip to the next talk. 00:15:54.000 --> 00:15:57.000 Would anybody like to make up a talk to go with John's slides? 00:15:57.000 --> 00:16:01.000 Yeah. 00:16:01.000 --> 00:16:03.000 Let's go ahead and proceed to the next talk. 00:16:03.000 --> 00:16:04.000 Good idea. 00:16:04.000 --> 00:16:13.000 If John. 00:16:13.000 --> 00:16:23.000 Okay, so that was me. Okay, so we have had great discussion and talks on machine learning and template matching for Northern California seismicity in this workshop. 00:16:23.000 --> 00:16:26.000 So we are doing a similar research project, to apply a template-matching analysis to the entire Northern California 00:16:26.000 --> 00:16:32.000 earthquake catalog. So we got the support from the USGS 00:16:32.000 --> 00:16:37.000 NEHRP. 00:16:37.000 --> 00:16:45.000 I'm sorry, it's moving too... maybe I put two... 00:16:45.000 --> 00:16:54.000 Sorry, it was the previous screen. Okay, I hope you can see what I see, the same thing. 00:16:54.000 --> 00:16:56.000 So what we do right now is only a 5-year period of template-matching analysis for the entire Northern 00:16:56.000 --> 00:17:08.000 California catalog, and thanks to the USGS analysts, our template matching is heavily based on the analysts' 00:17:08.000 --> 00:17:14.000 first picks.
And also I want to mention that we are using the EQcorrscan Python software. 00:17:14.000 --> 00:17:33.000 That's open-source software; thanks to its developers. So the whole processing that we do uses this EQcorrscan Python package. I think there was a talk on template matching and machine-learning picking for the Mendocino sequences on the first day, so we did the same 00:17:33.000 --> 00:17:40.000 thing. So right now we have found, through the template matching, about 3,500 new events. 00:17:40.000 --> 00:17:59.000 So we are hoping to compare our results to the others, to make sure that what we're doing is reasonable, before applying it back to the entire Northern California catalog. Again, we are only doing 5 years right now, and also we don't do relocation yet. 00:17:59.000 --> 00:18:00.000 So all the new events are assigned the same location as 00:18:00.000 --> 00:18:14.000 their template event, and we compute the amplitude and then evaluate the magnitude of the new event. That's how we get to the histogram 00:18:14.000 --> 00:18:19.000 you see here. So the orange is the original catalog, 00:18:19.000 --> 00:18:23.000 and then purple is what we added, the new events. 00:18:23.000 --> 00:18:30.000 So one thing I want to point out is that once we have new events through the template matching, we then use the new events as templates and scan once again, in a kind of iterative way. 00:18:30.000 --> 00:18:41.000 So this is good; we see that some events populate at the same time. 00:18:41.000 --> 00:18:45.000 So far I don't know how many times I have to do it, so maybe it never ends. 00:18:45.000 --> 00:18:49.000 But so far we see that one more iteration could be worthwhile. 00:18:49.000 --> 00:18:54.000 So that's what we are doing right now, and indeed we have to expand our time window.
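Evaluating a detection's magnitude from its amplitude relative to the template, as described above, is commonly done with a relative-magnitude relation. The exact formula used in this project isn't stated in the talk, so the one below is a generic sketch:

```python
import math

def relative_magnitude(m_template, amp_new, amp_template):
    """Magnitude of a detected event from its template's magnitude and the
    peak-amplitude ratio: M_new = M_template + log10(A_new / A_template).
    A common convention for template-matching catalogs; treated here as an
    assumption, since the talk does not give the formula used."""
    return m_template + math.log10(amp_new / amp_template)

# A detection ten times smaller in amplitude than its M 3.0 template -> M 2.0.
print(relative_magnitude(3.0, 0.5, 5.0))
```

One unit of magnitude per factor of ten in amplitude follows directly from the definition of local magnitude scales.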
00:18:54.000 --> 00:19:05.000 So right now, 5 years. Hopefully we can do another 5 years, and then maybe 10 years is our kind of milestone goal. 00:19:05.000 --> 00:19:19.000 Next I want to mention repeating earthquakes; Roland Bürgmann mentioned using those and showed some of that yesterday. So far we have two base catalogs: one is the Hayward and Rodgers 00:19:19.000 --> 00:19:23.000 Creek, and the other is the San Andreas at 00:19:23.000 --> 00:19:24.000 Parkfield. So those are the two areas where we have been doing the repeating-event 00:19:24.000 --> 00:19:34.000 detection, using data from the original catalog. 00:19:34.000 --> 00:19:37.000 So the maps here show 00:19:37.000 --> 00:19:43.000 the repeating events; each colored circle is a repeating event with a different coherency. 00:19:43.000 --> 00:19:47.000 So we have two base maps right now. And then, 00:19:47.000 --> 00:19:51.000 lastly, I want to point out something relating to the previous talk. 00:19:51.000 --> 00:20:07.000 I want to combine these two together. Once you have new events from the template-matching analysis, in some cases a new event happens to be a repeating event, the same as in the previous talk. So here is one example: these are the events in the original catalog, and 00:20:07.000 --> 00:20:11.000 in orange are the events from the template matching. 00:20:11.000 --> 00:20:20.000 Using this, it looks like these three are a repeating sequence, so we are hoping to combine the template-matching results to refine our repeating-event 00:20:20.000 --> 00:20:29.000 catalog. So hopefully our project is useful for this community and improves Northern California seismicity studies. 00:20:29.000 --> 00:20:40.000 Thank you. 00:20:40.000 --> 00:20:46.000 Hello. Okay.
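The core operation behind the template matching described in this talk is sliding normalized cross-correlation. EQcorrscan runs this over many channels and templates efficiently; the code below is a deliberately minimal single-channel sketch of the idea, not EQcorrscan's actual API:

```python
import numpy as np

def detect(template, stream, threshold=0.7):
    """Slide a template along a continuous trace and return (offset, cc) pairs
    where the normalized cross-correlation exceeds the threshold. A toy,
    single-channel illustration of what template-matching codes do."""
    nt = len(template)
    t_norm = (template - template.mean()) / (template.std() * nt)
    hits = []
    for i in range(len(stream) - nt + 1):
        window = stream[i:i + nt]
        if window.std() == 0:  # skip flat (e.g. all-zero) windows
            continue
        cc = np.sum(t_norm * (window - window.mean()) / window.std())
        if cc > threshold:
            hits.append((i, cc))
    return hits

# Demo: a scaled copy of the template buried in an otherwise quiet trace.
tmpl = np.sin(np.linspace(0, 3 * np.pi, 20))
stream = np.zeros(200)
stream[100:120] = 0.3 * tmpl
hits = detect(tmpl, stream)
print(max(hits, key=lambda h: h[1]))  # best detection sits at offset 100
```

Because the correlation is amplitude-normalized, the scaled-down copy still correlates at essentially 1.0, which is why template matching finds events much smaller than their templates.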
Thanks for allowing me to speak here. My name is Rick Schoenberg, from UCLA. 00:20:46.000 --> 00:20:58.000 This is work with a former student of mine, Josh Ward, as well as two colleagues, Max Werner and Bill Savran. 00:20:58.000 --> 00:21:15.000 I'm not gonna follow this outline on the left, but I just want to talk a bit about some preliminary results, not published yet, from CSEP, that I found kind of surprising, and maybe it'll surprise you too. A little bit of background 00:21:15.000 --> 00:21:34.000 as well: many of you are probably well aware that earthquake forecasting has a history of a lot of failures in attempts at earthquake forecasting and earthquake prediction, and that has led to a lot of skepticism among many in seismology toward all 00:21:34.000 --> 00:21:35.000 probabilistic models. But I think that's going a little bit too far. 00:21:35.000 --> 00:21:47.000 What we really need are ways of assessing these models so that we can tell which ones are fitting well and which ones are not, especially prospectively. 00:21:47.000 --> 00:21:58.000 History tells us it's not good enough just to rely on the literature and to look at results like, 00:21:58.000 --> 00:22:09.000 oh, well, this model would have worked well in the past. We need to do actual prospective forecasting, and fortunately, amazingly, 00:22:09.000 --> 00:22:19.000 the seismological community has set up a system for doing a very organized sequence of tests. 00:22:19.000 --> 00:22:37.000 Let me skip this one. I should mention the late, wonderful seismologist Yan Kagan, a good friend of mine, who sadly passed away last year. He did so much important work, not only in the modeling of earthquakes and forecasting, but also in the assessment and 00:22:37.000 --> 00:22:44.000 statistical evaluation of models.
So much important work was done by him, and we miss him very much. 00:22:44.000 --> 00:22:47.000 But this CSEP is what I want to talk about, 00:22:47.000 --> 00:22:56.000 the Collaboratory for the Study of Earthquake Predictability, which was designed to be fully automatic and truly prospective in forecasting. 00:22:56.000 --> 00:23:00.000 From a statistician's perspective like mine, 00:23:00.000 --> 00:23:04.000 it's really an amazing achievement that this was set up so that people can enter models into the system 00:23:04.000 --> 00:23:14.000 and we can see later how they actually performed. They're actually doing prospective forecasting. 00:23:14.000 --> 00:23:23.000 So I'm going to show you a little bit of results from data from 2013 to 2017, involving one-day forecasts. 00:23:23.000 --> 00:23:42.000 Each of the models put into this forecasting system forecasts, in little spatial-temporal grid cells, what the expected number of earthquakes would be in each of those cells, and the idea is to see which models fit the best. I'll skip the methodology 00:23:42.000 --> 00:23:45.000 part. But here are some of the results, and the idea is: 00:23:45.000 --> 00:24:02.000 red is bad and basically light blue is good. On the left we have results for STEP, the STEP model, and on the right we have results for ETAS. It had been kind of generally thought that ETAS would do the best, and it was doing the best early on, but we're finding that STEP actually does a little bit better 00:24:02.000 --> 00:24:10.000 for data between 2013 and 2017. 00:24:10.000 --> 00:24:11.000 Here's another graphic that's supposed to show this. 00:24:11.000 --> 00:24:14.000 The white gaps are sort of problems with the model, and we see that in STEP, 00:24:14.000 --> 00:24:25.000 compared to ETAS, the points are a little bit more evenly distributed, which is a good sign for the model. 00:24:25.000 --> 00:24:32.000 Here, let me skip to the end results.
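Gridded comparisons like the STEP-versus-ETAS one described here are conventionally scored with a Poisson log-likelihood over the forecast cells, the basis of CSEP's likelihood tests. A toy version, with made-up rates and counts rather than any actual model output:

```python
import math

def poisson_loglik(rates, counts):
    """Joint log-likelihood of observed counts under independent Poisson cells,
    the standard score behind CSEP-style grid-based forecast comparisons."""
    return sum(-r + n * math.log(r) - math.lgamma(n + 1)
               for r, n in zip(rates, counts))

observed   = [0, 2, 1, 0]          # toy earthquake counts per grid cell
forecast_a = [0.1, 1.8, 0.9, 0.2]  # expected counts, model A (well calibrated)
forecast_b = [0.1, 4.0, 3.0, 0.2]  # model B over-forecasts the active cells

# The better-calibrated model scores a higher (less negative) log-likelihood.
print(poisson_loglik(forecast_a, observed), poisson_loglik(forecast_b, observed))
```

Model B's penalty here mirrors the finding in the talk: systematically forecasting too many events near recent seismicity lowers the likelihood even when the spatial pattern is right.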
It seemed that STEP was slightly more accurate 00:24:32.000 --> 00:24:39.000 overall for our data set, which was surprising. ETAS maybe fit a little bit better in the northwest of California, 00:24:39.000 --> 00:24:47.000 but STEP fit a little bit better in most other areas. And the problem seemed to be that ETAS forecast outputs were too high: 00:24:47.000 --> 00:24:56.000 it expected too many earthquakes around places where earthquakes had happened recently, relative to what was actually observed. 00:24:56.000 --> 00:25:01.000 So, yeah, I thought that was kind of surprising. 00:25:01.000 --> 00:25:31.000 Maybe we should be looking more into STEP, and/or combinations of these models, going forward. 00:25:35.000 --> 00:25:44.000 Max, are you here? Or is someone else gonna give this talk? 00:25:44.000 --> 00:25:45.000 Hi! Can you hear me now? Oh, wonderful. Okay, sorry. So let me just restart. 00:25:45.000 --> 00:25:51.000 Oh, we can't hear you, Max. Yes, I can hear you now. 00:25:51.000 --> 00:25:56.000 I'm a Mendenhall postdoc research statistician at USGS 00:25:56.000 --> 00:26:04.000 in Moffett Field, and I'm going to be talking about aftershock forecasts, visualizing them based on user needs. 00:26:04.000 --> 00:26:10.000 This is joint work with the rest of the operational aftershock forecasting team at USGS and a number of international colleagues. 00:26:10.000 --> 00:26:18.000 So the USGS releases aftershock forecasts in an automated system following large earthquakes. 00:26:18.000 --> 00:26:26.000 We have a template product that shows our forecast, and it uses tables to show it. 00:26:26.000 --> 00:26:36.000 So this table shows the probability of at least one aftershock happening above a particular magnitude cutoff, for the next day, week, month, or year. 00:26:36.000 --> 00:26:37.000 And so my research question asks how we can represent this.
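The table entries described here, the probability of at least one aftershock above a magnitude cutoff in a time window, are typically derived from a Reasenberg-Jones rate model under a Poisson assumption. A sketch with generic placeholder parameters, not the USGS operational values:

```python
import math

def expected_aftershocks(m_main, m_min, t1_days, t2_days,
                         a=-1.67, b=0.91, p=1.08, c=0.05):
    """Expected number of aftershocks with M >= m_min between t1 and t2 days
    after the mainshock, integrating the Reasenberg-Jones rate
    lambda(t, M) = 10**(a + b*(m_main - M)) * (t + c)**(-p).
    Parameter values are generic placeholders, not the operational ones."""
    k = 10 ** (a + b * (m_main - m_min))
    integral = ((t2_days + c) ** (1 - p) - (t1_days + c) ** (1 - p)) / (1 - p)
    return k * integral

def prob_at_least_one(n_expected):
    """Poisson probability of one or more events given the expected number."""
    return 1.0 - math.exp(-n_expected)

n = expected_aftershocks(6.4, 5.0, 0.0, 7.0)  # first week, M >= 5
print(f"P(at least one M>=5 aftershock in week 1) = {prob_at_least_one(n):.2f}")
```

Each cell of the forecast table corresponds to one (magnitude cutoff, time window) pair fed through a calculation like this.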
00:26:37.000 --> 00:26:38.000 How can we show such a forecast in a graphical way, but effectively? 00:26:38.000 --> 00:26:49.000 That's the keyword: in a way that has the intended effect on our audience. 00:26:49.000 --> 00:26:50.000 This question can be broken down into a number of pieces. First, we're interested 00:26:50.000 --> 00:27:07.000 in which graphical types are most successful: different kinds of charts, maybe maps, or infographics, which combine different kinds of visualizations. Second, which users are we targeting here? 00:27:07.000 --> 00:27:14.000 We've typically designed around specialist user needs, like civil engineers or emergency managers, 00:27:14.000 --> 00:27:17.000 but we're releasing these forecasts publicly, 00:27:17.000 --> 00:27:23.000 so we might be interested in non-specialists as well. Third, what are the uses that these audiences put the forecast to? 00:27:23.000 --> 00:27:39.000 Well, this is a very long list. It goes from situational awareness, to delivering information to others, to managing limited resources, and it goes on and on. And then, finally, what are our actual communication goals as the forecast provider? 00:27:39.000 --> 00:27:45.000 Do we care more about the number of aftershocks, or communicating the probability that one will be damaging? 00:27:45.000 --> 00:27:51.000 Do we prioritize spatial or temporal patterns, or the uncertainty in the forecast? 00:27:51.000 --> 00:27:52.000 So I'm proposing a four-step research program to answer these questions. 00:27:52.000 --> 00:28:09.000 First, we gather a number of users from target user groups in workshops, where they do small-group activities that elicit their needs for aftershock forecast information. 00:28:09.000 --> 00:28:18.000 We can then design forecast products around the user needs that we found in the workshops. 00:28:18.000 --> 00:28:25.000 We could then take a subset of those forecast products that we designed,
00:28:25.000 --> 00:28:32.000 from maps to graphs, infographics, etc., 00:28:32.000 --> 00:28:37.000 and test them in a user experiment. This is a task-based experiment where participants do tasks that are aligned with their user needs, 00:28:37.000 --> 00:28:55.000 which we identified in the workshops in step one. And finally, analyzing that experiment allows us to quantitatively measure how well different products actually meet user needs. 00:28:55.000 --> 00:28:57.000 Now, we'll be conducting this work in the U.S., 00:28:57.000 --> 00:29:19.000 Mexico, and El Salvador, with support from the Bureau of Humanitarian Assistance and partners in Mexico and El 00:29:19.000 --> 00:29:30.000 Salvador. So stay tuned for updates from this project. Thanks. 00:29:30.000 --> 00:29:36.000 Okay, thanks everyone for that. 00:29:36.000 --> 00:29:41.000 Questions for all of the previous speakers? 00:29:41.000 --> 00:29:46.000 Please raise your hand. 00:29:46.000 --> 00:29:54.000 On, where? 00:29:54.000 --> 00:29:56.000 There you go. Me, Bill? Great, great! Thank you, Hamid. I have a question for you. 00:29:56.000 --> 00:30:01.000 Yeah. Go ahead. 00:30:01.000 --> 00:30:19.000 I'm noticing similarities between the Ferndale earthquake, the exceptionally high PGA at the Highway 101 Painter Street bridge, and the 2014 South Napa 6.0 earthquake at the Crockett Carquinez 00:30:19.000 --> 00:30:30.000 Bridge. Both of those recordings were free field, if I'm not mistaken. I'm just wondering if you know of any kind of correlation, or have thoughts, 00:30:30.000 --> 00:30:37.000 I guess, is what I'm asking for. Do you have any thoughts about those particularly high ground motions at South Napa, 00:30:37.000 --> 00:30:42.000 the Carquinez Bridge, and then the Ferndale? Thanks. 00:30:42.000 --> 00:30:47.000 Well, in general, I think there are these high frequencies.
00:30:47.000 --> 00:30:59.000 If we divide them into two groups, one would be at the bridges, and in many cases high frequencies at bridges could be due to pounding impact. 00:30:59.000 --> 00:31:04.000 And that can be seen in different places. 00:31:04.000 --> 00:31:10.000 However, the part that relates to ground motion, I think that would be something different. 00:31:10.000 --> 00:31:26.000 We observed it in the Ferndale area, and as I remember, in the 2004 Parkfield earthquake we had a PGA over 2 g 00:31:26.000 --> 00:31:32.000 at the station Fault Zone 16. Very high frequency, 00:31:32.000 --> 00:31:39.000 no structure around. And I think someone in the chat reminded us about some studies 00:31:39.000 --> 00:31:45.000 that attribute this type of high frequencies to the source, 00:31:45.000 --> 00:31:50.000 not the impact. 00:31:50.000 --> 00:31:51.000 Thank you. 00:31:51.000 --> 00:31:57.000 Sure. 00:31:57.000 --> 00:32:05.000 Other questions? 00:32:05.000 --> 00:32:07.000 Roland. 00:32:07.000 --> 00:32:13.000 Yeah, this is for David. Thanks for this analysis providing more detail of the seismicity before Parkfield. 00:32:13.000 --> 00:32:22.000 But when you refer to your observations as precursory, 00:32:22.000 --> 00:32:28.000 you seem to be implying that they are anomalous compared to any other time, so we would maybe have had the chance to see this coming. 00:32:28.000 --> 00:32:34.000 But you can't really do that having looked at only one month of data. 00:32:34.000 --> 00:32:52.000 Or do you think this behavior stands out in a way that would have the promise of being precursory, in the sense of recognizing an event to come? 00:32:52.000 --> 00:32:59.000 Yeah, in one of the catalogs, I can't remember which, the detected catalog, 00:32:59.000 --> 00:33:06.000 there was sort of a period of events that were on the red line, and then it sort of slowed down and then accelerated.
00:33:06.000 --> 00:33:10.000 So yeah, ideally, we'd like to look at much longer time periods. 00:33:10.000 --> 00:33:16.000 One problem is computation time; it's already a lot of computation time for one month. 00:33:16.000 --> 00:33:18.000 So ideally, we have to go back and see how robust it is. 00:33:18.000 --> 00:33:34.000 So yeah, it's a precursor in the sense of the month and the changing, accelerating rate in the days before. In terms of robustness, 00:33:34.000 --> 00:33:40.000 if there were fluctuations before, we don't know. 00:33:40.000 --> 00:33:41.000 Cool. Thank you. 00:33:41.000 --> 00:33:48.000 Was the velocity change a possible explanation for the migration hypothesis? 00:33:48.000 --> 00:33:57.000 Yeah, that might be a separate precursor. By the way, foreshocks are maybe the most undisputed precursors before earthquakes, 00:33:57.000 --> 00:34:03.000 but people debate whether there's any predictability associated with foreshocks. 00:34:03.000 --> 00:34:04.000 Right. 00:34:04.000 --> 00:34:08.000 So I call them precursors. 00:34:08.000 --> 00:34:09.000 Well, I hope you get the computational resources to look at more data. 00:34:09.000 --> 00:34:13.000 That would be great. 00:34:13.000 --> 00:34:18.000 Yeah, thank you. 00:34:18.000 --> 00:34:21.000 Sorry, go ahead. 00:34:21.000 --> 00:34:34.000 So for Max Schneider, on the forecasting of aftershocks, there are two elements that I can tell you fire and rescue and the other response services are interested in.
00:34:34.000 --> 00:34:54.000 One is looking at the possibility, or the probability, of an earthquake mainshock being followed by an even bigger mainshock, which would make the first one a precursor or a foreshock, I guess. Looking at something more damaging after the mainshock, because for us, we're considering 00:34:54.000 --> 00:35:09.000 constructs where we would potentially alert certain emergency response resources and possibly even pre-position some, under a menu basically still to be determined. 00:35:09.000 --> 00:35:16.000 The second part is when we're responding to damaged and collapsed buildings after a mainshock: 00:35:16.000 --> 00:35:27.000 knowing how often, how frequent, or how strong aftershocks are likely to be while we're putting people in danger conducting search and rescue operations and firefighting. 00:35:27.000 --> 00:35:33.000 So the work you're doing is excellent. There'll be more to come. 00:35:33.000 --> 00:35:37.000 We're looking forward to the next workshop. 00:35:37.000 --> 00:35:43.000 Thanks a lot, Larry. I took notes on all of that, and it's quite in alignment with other needs 00:35:43.000 --> 00:35:44.000 expressed at the workshop last week. I know you couldn't 00:35:44.000 --> 00:36:05.000 make it. But one thing that's interesting is the idea that foreshock probabilities are particularly useful, because we have lots of evidence 00:36:05.000 --> 00:36:24.000 in previous situations where an aftershock that was lesser in magnitude than the mainshock actually caused more damage to an urban area than the mainshock itself. And so we might want to start thinking less about magnitudes themselves 00:36:24.000 --> 00:36:30.000 and more about shaking potential, or forecasted shaking levels.
00:36:30.000 --> 00:36:40.000 Absolutely. So all those factors, and we want to really start digging in on what the science can tell us, to do these things like pre-positioning in some cases, or even what we're calling earthquake box alarm responses. So there's more to come. 00:36:40.000 --> 00:36:47.000 Yeah, earthquake box alarm responses. 00:36:47.000 --> 00:36:51.000 Right, thanks. 00:36:51.000 --> 00:36:52.000 Okay, thanks everyone for your questions. We're gonna continue with the next series of talks, 00:36:52.000 --> 00:36:59.000 now focusing. 00:36:59.000 --> 00:37:05.000 Hello! Can you hear me? 00:37:05.000 --> 00:37:06.000 Yes. 00:37:06.000 --> 00:37:07.000 Hello. Oh, good. Let me jump in there. 00:37:07.000 --> 00:37:08.000 Hello! I'm Anne Wein, HayWired scenario coordinator. 00:37:08.000 --> 00:37:20.000 The HayWired earthquake scenario was developed by hundreds of physical and social scientists, engineers, emergency managers, and urban planners, 00:37:20.000 --> 00:37:28.000 and together we produced these technical reports, from which we derived fact sheets and movies, geonarratives, and webinars. 00:37:28.000 --> 00:37:40.000 So Mark Benthien, of the Southern California Earthquake Center and Earthquake Country Alliance, will present the latest: the HayWired scenario exercise toolkit. 00:37:40.000 --> 00:37:41.000 Okay. 00:37:41.000 --> 00:37:45.000 Thank you, Anne, and we're really excited to be rolling this out over the next couple of months. 00:37:45.000 --> 00:38:10.000 It is based entirely on the incredible amount of information provided within the HayWired scenario itself, and a number of partners, as the reports were coming out, saw this as something that could be the basis of different types of discussion-based or other exercises.
00:38:10.000 --> 00:38:31.000 Just all of the tremendous amount of information that is included in the scenario, and how that could be used by businesses and government agencies and nonprofits, and anyone who really wanted to plan for what might happen in an earthquake and plan their response for that type of situation. And 00:38:31.000 --> 00:38:50.000 really looking at this not just for the one day when an earthquake like this might happen, but also just when maybe more frequent challenges come up. You know, if you're looking at the scenario and what it says about loss of water, well, you might lose water service for other 00:38:50.000 --> 00:39:00.000 reasons too, and so by exercising for this particular scenario, you're really looking at how you might meet more frequent challenges. 00:39:00.000 --> 00:39:12.000 And the toolkit is designed for any organization to plan, to lead, and to learn from such exercises and from the scenario. A lot of partners were involved; 00:39:12.000 --> 00:39:27.000 I won't list everybody here and read through, but from the Bay Area and beyond, in the development of the toolkit and its various components, which include an overview of the scenario, guidance for developing an exercise, and the core of it is really these 45 situation 00:39:27.000 --> 00:39:52.000 exercise ideas based entirely on the findings of the scenario, 18 of which have been really filled out with discussion of objectives, discussion questions, and data, and even slides, so that an organization can just choose one of these and not have to go through and plan anything, just take this off the shelf, as it were, 00:39:52.000 --> 00:40:10.000 to have an exercise. We've taken the whole index of the scenario reports and annotated them also, for where you can find data for these topics, and really a kind of guide for having an exercise using the 00:40:10.000 --> 00:40:14.000 HayWired scenario and this toolkit.
I'll just jump to that. 00:40:14.000 --> 00:40:20.000 In our rollout plan we have some workshops coming up, and a webinar, and also training the trainers for Chambers of Commerce 00:40:20.000 --> 00:40:32.000 staff, so that businesses can really use this, and the members of those Chambers can participate in that. 00:40:32.000 --> 00:40:46.000 So all the information you can find at earthquakecountry.org/haywired. 00:40:46.000 --> 00:40:50.000 Hi all! Are you able to hear me? 00:40:50.000 --> 00:40:51.000 Yes. 00:40:51.000 --> 00:40:56.000 My name is Amy LaDuke, and I work for the California Office of Emergency Services 00:40:56.000 --> 00:40:57.000 earthquake, tsunami, and volcano program, 00:40:57.000 --> 00:41:05.000 and I'm here today to talk a little bit about our statewide earthquake program, and how we take all of the work you do 00:41:05.000 --> 00:41:11.000 and apply it to our emergency management work. 00:41:11.000 --> 00:41:16.000 So, in addition to some of the projects, if I can get this slide to advance, there we go. 00:41:16.000 --> 00:41:19.000 In addition to the projects we have listed here on this slide, 00:41:19.000 --> 00:41:28.000 I wanted to talk about a couple of projects we're taking on where we specifically apply the science that all of you are working on to our emergency management projects. 00:41:28.000 --> 00:41:34.000 And we diversify our projects throughout preparedness, mitigation, response, and 00:41:34.000 --> 00:41:39.000 recovery, with our focus on equity and economic recovery, 00:41:39.000 --> 00:41:50.000 in addition to mitigation. So one of the projects we're working on right now is a loss estimation library; we're using Hazus to develop this library. 00:41:50.000 --> 00:41:51.000 We worked with the California Geological Survey to identify scenarios for California counties 00:41:51.000 --> 00:42:05.000 using the 2014 BSSC
catalog and the USGS ShakeMap scenario website. 00:42:05.000 --> 00:42:08.000 We selected 87 scenarios that we will now use to develop 00:42:08.000 --> 00:42:26.000 these Hazus runs and incorporate those into our library, which we will maintain. This information will be used for future hazard mitigation projects, mitigation planning, and developing loss estimates during earthquake events. Another project we're working on is the seismic 00:42:26.000 --> 00:42:33.000 inventory of essential facilities. We began a pilot project for that this year in Humboldt and Del Norte Counties. 00:42:33.000 --> 00:42:34.000 We worked with consultants to conduct the FEMA P- 00:42:34.000 --> 00:42:48.000 154 Level 1 screening, using the rapid visual screening methodology. We completed screening at 201 facilities throughout the 2 counties, for a total of 1,019 structures. 00:42:48.000 --> 00:43:10.000 Right following the completion of that screening, the magnitude 6.4 Humboldt earthquake occurred, so we had the opportunity to incorporate into this project some post-event field observations, which were then noted, and our consultants did share out their observations during the earthquake 00:43:10.000 --> 00:43:20.000 clearinghouse reporting process. So they were able to follow up on some of the screenings that they did, and see how some of those structures performed during an actual event. 00:43:20.000 --> 00:43:38.000 This information will then be shared with our local partners in both Humboldt and Del Norte Counties, which will be an opportunity for them to follow up with additional screening and assessment of structures, and then also future mitigation and retrofitting opportunities in those counties. We hope 00:43:38.000 --> 00:43:45.000 to eventually expand this project out to additional counties, so that we have a good inventory of these structures.
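The FEMA P-154 Level 1 screening described here reduces to a simple scoring scheme: a basic score for the building's structural type, plus penalty modifiers for features like irregularities and poor soil, compared against a cutoff that flags buildings for detailed evaluation. A minimal sketch of that logic follows; all numeric values are made-up placeholders, not figures from the published P-154 scoreforms.

```python
# Sketch of FEMA P-154 Level 1 rapid visual screening scoring.
# All numeric values below are illustrative placeholders, not values
# taken from the published P-154 scoreforms.

CUTOFF = 2.0  # buildings scoring below the cutoff are flagged for a
              # detailed structural evaluation

def level1_score(basic_score, modifiers, s_min):
    """Final Level 1 score: basic score plus applicable score modifiers,
    never falling below the minimum score for the building type."""
    return max(basic_score + sum(modifiers), s_min)

def needs_detailed_evaluation(score, cutoff=CUTOFF):
    """True when the screening score falls below the cutoff."""
    return score < cutoff

# Hypothetical building: basic score 3.0, penalized for a vertical
# irregularity (-1.0) and soft soil (-0.6).
score = level1_score(3.0, [-1.0, -0.6], s_min=0.3)
flagged = needs_detailed_evaluation(score)
```

Run over an inventory like the 1,019 structures mentioned above, tallying the flagged buildings is the kind of output that feeds a follow-up assessment list.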
00:43:45.000 --> 00:43:52.000 So, in addition to those on the screen, we have several projects. As you've heard from Mark and the 00:43:52.000 --> 00:43:56.000 ECA, we're trying to do some additional education outreach. 00:43:56.000 --> 00:44:09.000 We have a mini-awards program that we use, and we're able to implement non-structural mitigation throughout the State, which is really important, because that is one of the number one ways that people get injured during an earthquake event. 00:44:09.000 --> 00:44:28.000 And also, we're working with the Redwood Coast Tsunami Work Group, located up there on the north coast, to update our Living on Shaky Ground magazine, which is a great way to get information out about the earthquake hazard and ways that people can protect themselves during 00:44:28.000 --> 00:44:31.000 earthquakes on the north coast. So we are right now in the process of collecting proposals for our FY 00:44:31.000 --> 00:44:43.000 23 NEHRP grant, so if any of you have proposals, we encourage you to send them to us, and on our next slide 00:44:43.000 --> 00:44:47.000 I'll have a way that you can do that. You see our goals right here on this slide. 00:44:47.000 --> 00:44:50.000 So basically we're looking at ways that we can do some mitigation. 00:44:50.000 --> 00:45:00.000 That's really one of our big goals, to help folks so that they can recover quicker from future earthquake events. 00:45:00.000 --> 00:45:09.000 And this here is ways to connect with us. So again, I'm Amy LaDuke, and also Phil Labra, who is our earthquake program specialist. 00:45:09.000 --> 00:45:21.000 You can send any ideas to him, or if you have any questions, and again, along with your project ideas, we do need associated costs, and we need that by February the seventeenth. 00:45:21.000 --> 00:45:40.000 So again, if you have any questions, please reach out to Phil; his phone number and email address are there on this slide. Thank you.
00:45:40.000 --> 00:45:45.000 Hi! I'm Elaine, and I'm gonna be presenting 00:45:45.000 --> 00:45:59.000 the updates to the post-earthquake response application and data collection schema that Kate Thomas and Tim Dawson and I have been working on, starting over the summer and then continuing throughout the year. 00:45:59.000 --> 00:46:06.000 So this is a way of collecting data during the rapid phase of data collection 00:46:06.000 --> 00:46:08.000 after an earthquake. It's done using the ArcGIS Field Maps 00:46:08.000 --> 00:46:15.000 application. Previous iterations of this were in ArcGIS Collector, 00:46:15.000 --> 00:46:34.000 but we've migrated it over to Field Maps, and what's useful about this is it allows for rapid sharing of data that's being collected to other users and other teams, as long as you have an Internet connection, or it can be synced later on, like in 00:46:34.000 --> 00:46:39.000 the evening, or when you have an Internet connection. And it also allows for uniform data collection: 00:46:39.000 --> 00:46:42.000 everyone's collecting data with the same fields, 00:46:42.000 --> 00:46:51.000 so it's more useful. And this is basically an update of the schema that was developed during the Ridgecrest response, 00:46:51.000 --> 00:46:57.000 and you may have heard some folks talk about using it during previous talks 00:46:57.000 --> 00:47:14.000 throughout the workshop. And what we're really hoping for is, if everyone could take a few moments to check out the updates to the schema and the application, using the login information that's in the bottom left of the screen there, 00:47:14.000 --> 00:47:21.000 we'd really appreciate it. So this slide here is showing what the data collection schema looks like. 00:47:21.000 --> 00:47:22.000 It's been streamlined since its first iteration. 00:47:22.000 --> 00:47:31.000 And then the other sort of exciting or new thing about it, in Field Maps, is there's conditional visibility.
00:47:31.000 --> 00:47:40.000 So when you are entering data, you only see the fields that are relevant to the type of data that you're entering. 00:47:40.000 --> 00:47:49.000 So if you're logging a fault rupture observation, you won't see the fields for liquefaction or slope movement, or anything like that. 00:47:49.000 --> 00:48:00.000 There's point observations, line observations, and polygon observations, and each of those has different fields that are, again, relevant to that type of observation. 00:48:00.000 --> 00:48:07.000 And then for some of these fields there's text entry, and some of them have pick lists or dropdown items. 00:48:07.000 --> 00:48:27.000 And really the goal here is to balance efficiency, and sort of streamline data collection so that folks in the field are as unencumbered as possible, while also getting all the information that we need. And so, yeah, if you have time to provide some feedback 00:48:27.000 --> 00:48:29.000 on the fields, the way it's laid out, anything like that, that would be great. 00:48:29.000 --> 00:48:43.000 Kate Thomas is putting the information on how to do that in the chat. And just the last thing that I'll say is, there will eventually be an instruction document that's annotated, showing you how to do this, 00:48:43.000 --> 00:48:50.000 as well as a data dictionary that will also be annotated, explaining what all the fields are. And in the instruction document that's shown here in the QR 00:48:50.000 --> 00:49:03.000 code, and also linked in the chat, there's a survey for your feedback, and if the only thing you do is go to the survey and answer the questions about what confidence means, 00:49:03.000 --> 00:49:09.000 just the confidence of an observation or a measurement, that would be hugely useful. 00:49:09.000 --> 00:49:15.000 So thank you in advance for your time, and let me know if you run into any problems.
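The conditional visibility described here, where only the fields relevant to the chosen observation type appear on the form, can be pictured with a small sketch. The type and field names below are hypothetical placeholders, not the actual schema:

```python
# Sketch of conditional field visibility in a post-earthquake data
# collection form: observers only see the fields relevant to the
# observation type they picked. Type and field names are hypothetical.

COMMON_FIELDS = ["observer", "datetime", "photo", "confidence", "notes"]

TYPE_FIELDS = {
    "fault_rupture":  ["strike", "dip", "slip_cm", "sense_of_slip"],
    "liquefaction":   ["feature_type", "ejecta_present", "extent_m"],
    "slope_movement": ["movement_type", "scarp_height_m"],
}

def visible_fields(observation_type):
    """Fields shown on the form for one observation type: the common
    fields plus whatever is specific to that type."""
    return COMMON_FIELDS + TYPE_FIELDS.get(observation_type, [])

# A fault-rupture entry shows slip fields but no liquefaction fields.
fields = visible_fields("fault_rupture")
```

In Field Maps itself this behavior is configured declaratively, with visibility expressions on the form rather than application code; the sketch only shows the intent.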
00:49:15.000 --> 00:49:20.000 My contact information is there, and Kate Thomas is also available to answer questions. 00:49:20.000 --> 00:49:30.000 Thanks. 00:49:30.000 --> 00:49:36.000 Hey, good morning! This is Larry Collins, deputy chief for special operations and hazmat at Cal 00:49:36.000 --> 00:49:54.000 OES. I want to thank all the partners for being here today, and for the great information. This is gonna tie back a little bit to the end users, which include urban search and rescue and firefighting, using tools like earthquake early warning, the 00:49:54.000 --> 00:49:57.000 ShakeMaps, aftershock forecasting, and the other programs that have been mentioned today. 00:49:57.000 --> 00:50:10.000 In the end, from the perspective of the responders, we're trying to prevent things like firefighters and other rescuers being trapped in their fire stations and other facilities. 00:50:10.000 --> 00:50:12.000 That's earthquake early warning, and the many automated systems we're trying to apply statewide 00:50:12.000 --> 00:50:22.000 there. And then, in our damage surveys, data collection, prioritizing based on the ShakeMaps and other data that we can access, 00:50:22.000 --> 00:50:34.000 hopefully early on in the quake response, followed by our actual search and rescue operations: 00:50:34.000 --> 00:50:35.000 considerations like risk versus gain, which includes aftershocks and secondary structure collapse, 00:50:35.000 --> 00:51:04.000 the fact that we are probably going to be engaged in the catastrophic events for weeks, rescuing people and doing the recovery operations. Initially our responders, including single resources, go out and begin operations, followed by our California regional US&R task forces that come in to 00:51:04.000 --> 00:51:13.000 play for larger building collapse search and rescue operations, assisted by the 8 California FEMA task forces, out of the
00:51:13.000 --> 00:51:22.000 28 nationwide, and in some cases, in our catastrophic events, including international urban search and rescue teams assisting. 00:51:22.000 --> 00:51:24.000 So for the 5 stages of collapse search and rescue, you may be seeing them right in front of you and not realize the methodology. 00:51:24.000 --> 00:51:37.000 They are as follows. Stage 1 includes the response, arrival, and recon at a building collapse or a scenario like that. 00:51:37.000 --> 00:51:46.000 The initial search for lightly trapped people, 00:51:46.000 --> 00:51:52.000 that's stage 2. Then we go into stage 3, void-based search. 00:51:52.000 --> 00:52:03.000 That's where the firefighters and rescuers are actually crawling through the buildings, tunneling, breaching, maybe for days or weeks, in danger, and that's where we go back to the earthquake early warning 00:52:03.000 --> 00:52:23.000 potential and the aftershock forecasting. You know, this stage 3 would include all these aspects, trying to find people that are trapped in survivable void spaces, and that goes to the drop, cover, and hold on instruction 00:52:23.000 --> 00:52:28.000 we've been giving to the public, so they end up, hopefully, in a survivable void space 00:52:28.000 --> 00:52:34.000 even if the building collapses around them, followed by extrication. 00:52:34.000 --> 00:52:38.000 In this case, this was one of our operations in the Haiti earthquake, 00:52:38.000 --> 00:52:54.000 just an example. That's followed by stage 4, selected debris removal, taking buildings apart, dissecting buildings, looking for more survivable void spaces and more victims, and in the end we end up with general debris removal. And in all those phases of urban search and 00:52:54.000 --> 00:52:57.000 rescue, we're going to be relying on the data that is being provided by the researchers who have been speaking today
00:52:57.000 --> 00:53:09.000 and the tools you've been providing, to make it more efficient, more safe, and better for the people we're trying to rescue and help. 00:53:09.000 --> 00:53:31.000 Thank you very much for your time. 00:53:31.000 --> 00:53:39.000 Jeff, you're up. 00:53:39.000 --> 00:53:42.000 So, I'm Jeff Adams, and a little about me: 00:53:42.000 --> 00:53:47.000 I'm sitting in the cab of my truck right now, and I hope you can see me. 00:53:47.000 --> 00:53:50.000 I don't know if you can account for it, but I'm triple-booked. 00:53:50.000 --> 00:53:58.000 So I was actually born during the 1964 Anchorage earthquake; 00:53:58.000 --> 00:54:06.000 I came into this world during that event. In 2010 my house, my vacation home, was ripped in half in the 00:54:06.000 --> 00:54:10.000 earthquake that happened on Easter Sunday 2010. In 2019 00:54:10.000 --> 00:54:18.000 I got sent to Puerto Rico to deal with their earthquake, and in 2022 I was sent to Ferndale and the aftershocks. 00:54:18.000 --> 00:54:20.000 So what I'm going to talk to you about is the data collection that happens in the recovery phase, 00:54:20.000 --> 00:54:32.000 following Larry's talk on response. So most of you likely already know, and hopefully you can hear me. Just somebody give me a thumbs up. 00:54:32.000 --> 00:54:37.000 Good, got it. So what we do post-earthquake is we engage a group of engineers, typically in California 00:54:37.000 --> 00:54:48.000 working for either DGS or Caltrans. 00:54:48.000 --> 00:54:49.000 My job is to coordinate them and get them ready to use the ATC- 00:54:49.000 --> 00:54:58.000 20 format and perform the Safety Assessment Program using ATC- 00:54:58.000 --> 00:55:05.000 20. That used to be done in a paper format.
00:55:05.000 --> 00:55:23.000 We now do it using an Esri application, Survey123, and what that does is allow the inspectors to use their smartphones and go down the form-based application, and it follows to a T the ATC- 00:55:23.000 --> 00:55:31.000 20 format. And what it does is allow a more granular viewing spectrum on a dashboard, if you will. 00:55:31.000 --> 00:55:36.000 So our task is, firstly, to get there and then perform the safety evaluations of buildings. 00:55:36.000 --> 00:55:49.000 The task is to inspect an affected area; the purpose is to provide the number of damaged and destroyed buildings in the affected area. Going forward, 00:55:49.000 --> 00:56:03.000 through the payment process, we can start getting our arms around what we're gonna do with the folks who are severely affected and whose homes are not able to be lived in, etcetera. 00:56:03.000 --> 00:56:12.000 So we categorize them using the forms in ATC-20, and they're getting either a green, a yellow, or a red placard on their building. Again, 00:56:12.000 --> 00:56:17.000 this is captured in an electronic medium these days. 00:56:17.000 --> 00:56:22.000 And as we go forward we have a need and a want to engage more with your group 00:56:22.000 --> 00:56:36.000 of very smart people, I am very impressed, and find out how we can make what I end up doing, on the far end of what your studies tell us 00:56:36.000 --> 00:56:42.000 is coming, better. So with that, we set it up, and we use Cal OES 00:56:42.000 --> 00:56:50.000 IT. They provide usernames and passwords for these people to use their handheld applications. 00:56:50.000 --> 00:57:00.000 It's all captured in an exportable spreadsheet, and it can go to many different people, again using the green, yellow, red placards that show up. 00:57:00.000 --> 00:57:03.000 I decide what the rules of engagement are, the work schedule, 00:57:03.000 --> 00:57:08.000 give them a safety
briefing, talk about the weather, make sure they have the PPE 00:57:08.000 --> 00:57:29.000 they need, and then we make a notification through the affected city going forward, and they do a public service announcement, letting people know that strangers, who look like Caltrans people, easily identified, wearing something official and professionally done, are going to be coming to 00:57:29.000 --> 00:57:35.000 their house to perform this individual assessment or inspection. 00:57:35.000 --> 00:57:51.000 It is a little tedious on the front end, but, you know, when you get a couple of sets and reps in, and by the way, we went twice to the Ferndale earthquake, we went to Rio Dell first. I got there on the 20th of December, and 00:57:51.000 --> 00:57:59.000 then the aftershock hit, again, that has been mentioned several times, on the first of the year, and we went back, and it took us another week. 00:57:59.000 --> 00:58:02.000 So that's basically what we're doing and how we do it. 00:58:02.000 --> 00:58:07.000 If anybody has any questions, I'm bumping up on time, so I'm gonna stop there, 00:58:07.000 --> 00:58:14.000 but let me know if you have a question I can answer, please. Thank you. 00:58:14.000 --> 00:58:18.000 Hey, that wraps up our sessions. 00:58:18.000 --> 00:58:33.000 If anyone has questions for the previous set of speakers, please ask them now; raise your hand, and I'll call on you. 00:58:33.000 --> 00:58:36.000 Jay, go ahead! 00:58:36.000 --> 00:58:37.000 Hi, this question is for Jeff. So my colleagues and I have gone through the ATC- 00:58:37.000 --> 00:58:51.000 20 training. Is there a website where we can learn about the Survey 00:58:51.000 --> 00:59:02.000 123 app, and how to use it, and proceed with helping out when we are called to help out? 00:59:02.000 --> 00:59:03.000 Yeah. 00:59:03.000 --> 00:59:04.000 So, Jay, there's several options here.
00:59:04.000 --> 00:59:11.000 What we typically do, because I work for OES, and OES is the only agency that can mission-task other state agencies. 00:59:11.000 --> 00:59:12.000 Got it. 00:59:12.000 --> 00:59:15.000 If you're affiliated with one of them, we can 00:59:15.000 --> 00:59:26.000 work down that path. Otherwise it's difficult to use civilians and bring them in, because there's a vetting process that we typically don't have time to do. That said, 00:59:26.000 --> 00:59:28.000 if you take my email address, I'll give it to you very quickly. 00:59:28.000 --> 00:59:29.000 Okay. 00:59:29.000 --> 00:59:36.000 It's jeff.adams@caloes.ca.gov. 00:59:36.000 --> 00:59:39.000 We can take this offline, and I can help you. 00:59:39.000 --> 00:59:43.000 Alright! Thanks. 00:59:43.000 --> 00:59:47.000 Just as a sidebar, Jeff works for me, and with me. 00:59:47.000 --> 00:59:54.000 Jeff and I both were assigned to help with the Puerto Rico earthquake, and in 2020 we spent 00:59:54.000 --> 01:00:04.000 like a month there, basically coordinating the structural engineers and the others, including mutual aid from other states that came in to help with structural engineering evaluations using ATC-20, using the same system 01:00:04.000 --> 01:00:14.000 Jeff discussed. Some of that data capture started with the urban search and rescue discipline, 01:00:14.000 --> 01:00:17.000 and we've been expanding and using it in other ways. I just saw the other presentation where we're using dynamic capture for post-quake 01:00:17.000 --> 01:00:23.000 other issues, like, you know, fault rupture and so forth. 01:00:23.000 --> 01:00:33.000 So this is part of the common operating picture we're trying to establish in California and elsewhere, to consolidate all the information, to save lives, and then work beyond that. 01:00:33.000 --> 01:00:45.000 And so what Jeff's presenting, the fact that we're urban search and rescue specialists from firefighting,
01:00:45.000 --> 01:01:00.000 but that we were actually helping to coordinate Caltrans and other engineers doing that SAP work, is just another example of the bridging that we're doing. 01:01:00.000 --> 01:01:03.000 Thanks. 01:01:03.000 --> 01:01:13.000 Any other questions for our speakers? 01:01:13.000 --> 01:01:27.000 We are a little bit behind schedule, so if nobody jumps in, I'm probably gonna say, let's go have lunch.