Predictive flood prevention mathematical model that relates weather forecasts to water level at a river gauge (help needed)

Hey Guys

As a side project to my university degree I have dreamed up this project that I'd absolutely love to bring to fruition; however, there are aspects of it where I will need some guidance and probably some help.

The project centres around developing a state-space model of the river basin, relating the input (rainfall in the region) to the output (the height of the river at a particular gauge); see figure 1. This can be done using system identification in MATLAB and the like, which I'm well versed in.

 

 

[Figure 1: state-space model with regional rainfall as the input and river level at the gauge as the output]

 

The difficulty I am having is that I don't know where to start in getting the data into a usable form. My current plan is to develop something that takes the forecast weather (https://www.met.ie/) from the rainfall forecast map and then quantises it using a grid of cells approximately the size of the river basin; a visual representation is shown in figure 2 (excuse the awful drawing).

[Figure 2: rainfall forecast map quantised over a grid roughly the size of the river basin]

 

Once that data has been reduced to a single value per square (part of the research will be determining how the choice of mean, median, max, or whatever other method influences the accuracy of the model), it can be passed into the model shown in figure 3.

 

[Figure 3: the model that the per-square rainfall values are fed into]
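To make the quantisation step concrete, here is a minimal sketch in Python (the replies below suggest numpy/pandas) of collapsing a gridded rainfall field into one value per basin-sized cell. The array, cell size, and rainfall units are placeholders, and the mean/median/max choice is exactly the thing the research would compare.

```python
# Hypothetical sketch: reduce a gridded rainfall forecast to one value per
# basin-sized cell. `forecast` stands in for whatever 2-D rainfall field
# (mm/h) is eventually extracted from the Met Eireann map.
import numpy as np

def aggregate_cells(forecast, cell_rows, cell_cols, method="mean"):
    """Collapse each (cell_rows x cell_cols) block of the forecast grid
    into a single number using mean, median, or max."""
    reducers = {"mean": np.mean, "median": np.median, "max": np.max}
    reduce = reducers[method]
    n_rows = forecast.shape[0] // cell_rows
    n_cols = forecast.shape[1] // cell_cols
    out = np.empty((n_rows, n_cols))
    for i in range(n_rows):
        for j in range(n_cols):
            block = forecast[i * cell_rows:(i + 1) * cell_rows,
                             j * cell_cols:(j + 1) * cell_cols]
            out[i, j] = reduce(block)
    return out

# Example: a 100x100 pixel forecast reduced to a 4x4 grid of basin cells.
forecast = np.random.rand(100, 100) * 5.0   # placeholder rainfall field, mm/h
print(aggregate_cells(forecast, 25, 25, method="max"))
```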

 

Once data is gathered for the input and output, the mathematical model can be obtained using the MATLAB System Identification Toolbox.
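For anyone who would rather stay outside MATLAB, the same idea can be sketched in Python with plain least squares. This is only a minimal ARX-style illustration, not the toolbox's algorithm; `level` and `rainfall` are assumed to be aligned, equally sampled series built from the steps above.

```python
# Minimal ARX-style sketch: fit river level y[k] as a linear function of its
# own past values and past rainfall inputs u[k] by least squares.
import numpy as np

def fit_arx(level, rainfall, na=2, nb=2):
    """Return ARX coefficients [a_1..a_na, b_1..b_nb] fitted by least squares."""
    n = max(na, nb)
    rows, targets = [], []
    for k in range(n, len(level)):
        past_y = level[k - na:k][::-1]        # y[k-1] ... y[k-na]
        past_u = rainfall[k - nb:k][::-1]     # u[k-1] ... u[k-nb]
        rows.append(np.concatenate([past_y, past_u]))
        targets.append(level[k])
    theta, *_ = np.linalg.lstsq(np.array(rows), np.array(targets), rcond=None)
    return theta

def predict_next(theta, recent_level, recent_rain, na=2, nb=2):
    """One-step-ahead prediction from the most recent levels and rainfall."""
    past_y = np.asarray(recent_level)[-na:][::-1]
    past_u = np.asarray(recent_rain)[-nb:][::-1]
    return float(np.concatenate([past_y, past_u]) @ theta)
```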

My questions to those of you who are more experienced programmers than myself are as follows:

  • what is the best way to try to strip the data from the Met Éireann website https://www.met.ie/?
  • is there a way of getting the data from the webpage itself before it is drawn, or would some image recognition be better?
  • what languages, libraries, etc. should I look into? I don't have the time, experience, or patience to do it from scratch unless it's relatively simple.

 

Something like this has been done before [1]; however, they used upstream gauges, so the prediction time frame is much shorter, whereas this approach can directly relate the forecast weather to the river height.

 

Thanks in advance for your help, people, and if you have any questions or just find it interesting and want to know more, drop me a message or post below.

thank you

 

 

 

 

[1] https://ieeexplore-ieee-org.jproxy.nuim.ie/document/8064947

 


You could do it in Python with Keras, sklearn, numpy, and pandas (all free/open-source).

Probably Python's html.parser for the website information.
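As a rough illustration of that suggestion, here is a sketch using Python's built-in html.parser. It assumes the values you want actually appear as plain HTML (for example table cells) on the page, which, as discussed further down, may not be the case for a map drawn client-side; the URL is just the one from the post.

```python
# Sketch of scraping with Python's built-in html.parser, assuming the values
# are present as plain HTML (they may not be if the map is drawn client-side).
from html.parser import HTMLParser
from urllib.request import urlopen

class CellTextParser(HTMLParser):
    """Collect the text of every <td> cell on the page."""
    def __init__(self):
        super().__init__()
        self.in_td = False
        self.cells = []

    def handle_starttag(self, tag, attrs):
        if tag == "td":
            self.in_td = True

    def handle_endtag(self, tag):
        if tag == "td":
            self.in_td = False

    def handle_data(self, data):
        if self.in_td and data.strip():
            self.cells.append(data.strip())

html = urlopen("https://www.met.ie/").read().decode("utf-8")
parser = CellTextParser()
parser.feed(html)
print(parser.cells)
```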


thanks for your reply

I'm coming from the point of view of an electronic engineer and only have experience in embedded stuff 

Could you direct me towards some relatable examples? That would be so helpful, as I have basically zero experience in this area.

thanks

 


6 hours ago, Ciano said:
  • what is the best way to try to strip the data from the Met Éireann website https://www.met.ie/?
  • is there a way of getting the data from the webpage itself before it is drawn, or would some image recognition be better?
  • what languages, libraries, etc. should I look into? I don't have the time, experience, or patience to do it from scratch unless it's relatively simple.

Before it's drawn to the screen you cannot get it "directly".

Getting it from the image once drawn is possible, but tedious.
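If the image route were taken, one loose interpretation of it is a colour-lookup pass over the rendered map using Pillow and numpy. The legend colours and rainfall bands below are placeholders; a real mapping would have to be read off the site's own colour key.

```python
# Rough sketch of the "from the image" route: map rendered pixel colours back
# to rainfall bands using a hand-built legend. The RGB values and mm/h bands
# are placeholders, not the site's real colour key.
import numpy as np
from PIL import Image

LEGEND = {                      # placeholder colour -> mm/h mapping
    (200, 200, 255): 0.5,
    (120, 120, 255): 2.0,
    (40, 40, 200): 8.0,
}

def rainfall_from_image(path):
    img = np.array(Image.open(path).convert("RGB")).astype(int)
    out = np.zeros(img.shape[:2])
    for colour, mm_per_hr in LEGEND.items():
        # tolerate small anti-aliasing differences around each legend colour
        mask = np.abs(img - np.array(colour)).sum(axis=2) < 30
        out[mask] = mm_per_hr
    return out
```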

 

Their map uses OpenLayers, which is a well-known mapping library that competes with Google Maps. I've used it quite a lot. There are API methods to gather data from it. Whether they have exposed those features to external users is something you can only know by asking them. Having that access would let you generate the picture without having to load the full page.

 

That being said, knowing they use OpenLayers means they have a data model plugged into their website, so it's accessible somehow. Maybe just privately, but accessible. The best thing would be to call/email them and ask whether there is a request you can make to pull the data, or whether it's available somewhere else. Sometimes it's just a secondary page with text file data. Having the data directly completely removes the need to generate the image. They might give you access to a path where they have Excel or CSV files with all the weather data.

 

Last but not least, if they do not offer this you still have two choices: either load the page at the proper location and parse the image, which is tedious, or contact your national weather association. I have dealt with many of those organisations around the world, and the only ones I have dealt with that were not free to view for the public were some places in Russia and China.

 

In the US that would be the NCDC, in Canada it would be Environment Canada, and in Europe it "was" (I think it has changed since the last time I played with these) the GFCS.

 

They offer most simple data for free to the public, but you need to ask them for access (usually you email them and they send you a username/password). Once you have access you usually have all data for the last 100+ years for the US/Canada; Europe had only 25 years back then. Plus you have live data, for which you can make a limited number of requests per day. If you want very advanced data that is restricted, you need to provide a company number and a reason for use. Pretty much all of the data you can see on the different weather websites is part of the free public info. Restricted access has stuff where just spelling the word needs at least four degrees.


6 hours ago, Ciano said:

thanks for your reply

I'm coming from the point of view of an electronic engineer and only have experience in embedded stuff 

Could you direct me towards some relatable examples? That would be so helpful, as I have basically zero experience in this area.

thanks

 

The websites for the libraries explain how all the commands work.

 

https://www.numpy.org/

https://scikit-learn.org/stable/

https://keras.io/


Thanks for your help, there is some useful insight in there.

 

That is the Irish national weather forecasting website, so yeah, if you're saying that they tend to be accommodating in terms of data sharing, that would be a good approach.

This will probably also help make the model more accurate because of the increased number of inputs to the state-space equations.

23 minutes ago, Franck said:

contact your National Weather association

It seems that you have past experience with stuff like this. In terms of the live data updates that they provide, generally speaking, how is that done, and what would be a good way of interfacing the data and the developed mathematical model so that it can push the output data to a web page like the following, http://www.riverspy.net/opwgauge.html?code=07005 (I'm friendly with the guy that runs this site), and a second prediction graph can be drawn?

 

thanks again 


The data I have gotten in the past was 90% of the time Excel files, and CSV for older data (you usually need old data to build your future model).

For live data it's usually a web service that is easy to parse as CSV or XML. You do have to provide some inputs, as you cannot grab every single piece of data in the world; you wouldn't have the storage anyway. Most are of the form www.[WhateverSite].blablabla/WeatherData?Longitude=X;Latitude=Y;range=X, and it then returns a readable CSV/XML with all weather stations and their recent sensor outputs listed by time. Some data requires another call, like asking for deeper data on station "WS1055", such as what the carbon levels are like at X altitude, and things like that.
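A hedged sketch of consuming a service of that shape with the Python requests library; the host and parameter names are copied from the illustrative URL above and are not a real endpoint.

```python
# Sketch of pulling a CSV web service of the kind described above. The host
# and parameter names are placeholders from the illustrative URL in this post.
import csv
import io
import requests

params = {"Longitude": -6.9, "Latitude": 53.5, "range": 25}
resp = requests.get("https://www.example-weather-service.org/WeatherData",
                    params=params, timeout=30)
resp.raise_for_status()

for row in csv.DictReader(io.StringIO(resp.text)):
    print(row)   # one record per station/timestamp, as returned by the service
```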

 

If you look at the example you just provided the link to, it is hooked up to the CSV data on another site, which is great and much more readable:

http://waterlevel.ie/data/week/07005_0001.csv

If you look at the entries, it has a 15-minute pull rate on this data. This is great; most of those I have worked with are per hour, especially for something less common like water level. Usually 15-minute and shorter pull rates are only for very common values such as DB, WB, GR, wind direction, wind speed, ceiling level, and precipitation. Unless that is a dangerous flood zone, I guess their equipment must generate more data for emergency reasons.
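For a quick look at that particular file, pandas can read the CSV straight from the URL. The column layout is not guaranteed here, so the sketch prints the columns first and only assumes that a timestamp column and a level column exist.

```python
# Quick look at the gauge CSV linked above with pandas. The column names are
# not guaranteed; inspect df.columns first, then parse accordingly.
import pandas as pd

url = "http://waterlevel.ie/data/week/07005_0001.csv"
df = pd.read_csv(url)
print(df.columns)   # check what the file actually calls its fields
print(df.head())

# Assuming the first column is a timestamp and the second the level reading:
timestamps = pd.to_datetime(df.iloc[:, 0])
level = df.iloc[:, 1]
```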

 

I have only dealt with live data for the US and Canada. For the rest of the world I only used the old data; it was enough, as we needed it in maybe 1 out of 100 jobs. I was sending a request each year to each national weather organization to get the latest year of data and updating my databases.

 

Data NEVER includes models. Data is data: simple values read from instruments. All weather channels / offices use this data to compute their own models. If you happen to be able to get access to data from the local weather station, then I guess they could provide the whole model if they wanted.


OK, perfect. Going forward I'll try to get access to their model, as developing my own for the weather will probably result in an oversimplified linear model and would distract from the main aim of correlating the rainfall and the height of the selected river.

10 hours ago, Franck said:

If you happen to be able to get access to data from the local weather station, then I guess they could provide the whole model if they wanted.

In terms of the "riverspy" website, I'm aware that it gets its data from the OPW (Office of Public Works) waterlevel.ie website. My question is more directed at: once I have found a working model and it's "predicting" the future level, how do I push this to that website and draw it, or would it be better for the website to implement it directly?

 


1 hour ago, Ciano said:

OK, perfect. Going forward I'll try to get access to their model, as developing my own for the weather will probably result in an oversimplified linear model and would distract from the main aim of correlating the rainfall and the height of the selected river.

In terms of the "riverspy" website, I'm aware that it gets its data from the OPW (Office of Public Works) waterlevel.ie website. My question is more directed at: once I have found a working model and it's "predicting" the future level, how do I push this to that website and draw it, or would it be better for the website to implement it directly?

 

You want your model on their site?

 

For that you either need access to their website so you can code the page there, or they need to code it and you need to provide a web service they can pull the data from.

 

I assume, since you want it on their site, that you somehow have a way / credentials to do so.


That would be the end goal. I'm friendly with the guy that developed that website and he's more than willing to give access to it. You put it better than myself: the question is whether it would be better to develop a web service for the site to pull data from, or, with the given access, to code it into the site directly. The main goal is having the freedom to adapt the model over time without having to overhaul the thing each time. What are the advantages/disadvantages of each approach?


15 hours ago, Ciano said:

That would be the end goal. I'm friendly with the guy that developed that website and he's more than willing to give access to it. You put it better than myself: the question is whether it would be better to develop a web service for the site to pull data from, or, with the given access, to code it into the site directly. The main goal is having the freedom to adapt the model over time without having to overhaul the thing each time. What are the advantages/disadvantages of each approach?

The advantage of making a service is that you supply the format of the data along with the input parameters. This standardises the way to get the information and allows any language to consume it in the future. You can consume a web service with any language that supports HTTP requests. But you have to know whether the website host supports service hosting, and what kind.
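As one possible shape for that web service, here is a minimal Flask sketch that exposes the model's predicted levels as JSON for the riverspy site to pull. The endpoint name, payload fields, and the placeholder prediction function are all made up for illustration; the real version would call the identified model.

```python
# Minimal sketch of the "web service" option using Flask (one possible choice):
# the other site pulls predictions from this endpoint instead of the model
# being coded into its page. Endpoint name and payload are made up here.
from flask import Flask, jsonify

app = Flask(__name__)

def predict_levels():
    """Placeholder for the identified model; returns (time, level) pairs."""
    return [{"time": "2019-03-01T12:00:00Z", "predicted_level_m": 1.42},
            {"time": "2019-03-01T13:00:00Z", "predicted_level_m": 1.45}]

@app.route("/predictions/<gauge_code>")
def predictions(gauge_code):
    # gauge_code, e.g. "07005", selects which gauge's forecast to return
    return jsonify({"gauge": gauge_code, "forecast": predict_levels()})

if __name__ == "__main__":
    app.run(port=5000)
```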

 

If you don't really know the subject, it's much faster to code it into the page itself and leave it at that. It's always possible to change that in the future anyway.

