
Middle tier on a flash site w/ XML?



murple
06-21-2001, 01:54 PM
I'm working on a flash site that is database driven, and I'm running into an increasingly frequent problem: the database server goes down and my site becomes pretty much unusable until the admin wakes up from his nap and reboots the DB. I've heard of sites using XML to essentially copy the contents of a database into a file or series of files that searches can be run against. The principle, I guess, is to cut the database calls down to just updates and deletions. PHP is at my disposal and I've been able to parse XML files, but any ideas on how to build this middle tier without eating up more resources scanning and searching XML docs than it would take to just call the DB?
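
Just to make the idea concrete, here's roughly the kind of export I'm picturing: a script that runs after each update and dumps a table out to an XML file so searches can hit the file instead of the DB (rough and untested; the table and column names are made up):

<?php
// rough sketch: dump a table out to an XML file so searches can run against
// the file instead of hitting the database (table/column names are made up)
$db = mysql_connect('localhost', 'user', 'pass');
mysql_select_db('mysite', $db);

$result = mysql_query('SELECT id, name, score FROM stats', $db);

$fp = fopen('data/stats.xml', 'w');
fwrite($fp, "<?xml version=\"1.0\"?>\n<stats>\n");
while ($row = mysql_fetch_assoc($result)) {
    fwrite($fp, '  <item id="' . $row['id'] . '">'
        . '<name>' . htmlspecialchars($row['name']) . '</name>'
        . '<score>' . $row['score'] . '</score>'
        . "</item>\n");
}
fwrite($fp, "</stats>\n");
fclose($fp);
mysql_close($db);
?>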

murple
06-26-2001, 06:55 PM
Hmm, no takers? Well, I'm thinking that scanning a series of XML files instead of one big XML doc would be best. I don't have DOMXML at my disposal on the host I'm currently using, so whatever I put together will be SAX based. So yay... loop and loop we go.
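
For the record, the SAX side would look something like this: loop over the files, fire up an expat parser for each, and collect anything whose character data matches the search term (skeleton only; the file names, element names and the match test are invented):

<?php
// SAX skeleton: scan a series of small XML files instead of one big doc
// (file names, element names and the match test are invented)
$needle  = 'foo';        // the search term
$matches = array();
$current = '';

function start_el($parser, $name, $attrs) {
    global $current;
    $current = $name;    // expat upper-cases element names by default
}
function end_el($parser, $name) {
    global $current;
    $current = '';
}
function char_data($parser, $data) {
    global $current, $needle, $matches;
    if ($current == 'NAME' && strstr($data, $needle)) {
        $matches[] = trim($data);
    }
}

foreach (array('data/stats1.xml', 'data/stats2.xml') as $file) {
    $parser = xml_parser_create();
    xml_set_element_handler($parser, 'start_el', 'end_el');
    xml_set_character_data_handler($parser, 'char_data');
    xml_parse($parser, join('', file($file)), true);
    xml_parser_free($parser);
}
?>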

tupps
06-26-2001, 07:59 PM
I think the easiest solution is to buy a big stick to hit the DB Admin with! :-) Otherwise...

The main questions for me are:

How much information is in your database? Is all of this information loaded into flash every time, or are there individual lookups?

What is causing the database to be updated? You? A client?

My way of thinking is this:

If you have a database that is updated occasionally by one person and read selectively into flash, you could do the following:

After updating the database, a post process is run that creates a number of XML files containing the information. A master XML file is also created specifying where these data XML files are on the server. These could then be read individually by the flash movie when needed.
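
Something along these lines is what I mean by the post process (untested sketch; all the table, column and file names are made up): every update rewrites a set of small data files plus one master file listing them, and the flash movie loads the master first and then only the data file it needs.

<?php
// post-process run after each database update: writes one small XML file
// per chunk of records plus a master index that the flash movie reads first
// (table, column and file names are all made up)
$db = mysql_connect('localhost', 'user', 'pass');
mysql_select_db('mysite', $db);

$master = fopen('data/master.xml', 'w');
fwrite($master, "<?xml version=\"1.0\"?>\n<files>\n");

$result = mysql_query('SELECT id, name, score FROM stats ORDER BY id', $db);
$chunk  = 0;
$count  = 0;
$fp     = 0;

while ($row = mysql_fetch_assoc($result)) {
    if ($count % 100 == 0) {          // start a new data file every 100 rows
        if ($fp) { fwrite($fp, "</stats>\n"); fclose($fp); }
        $chunk++;
        $name = "data/stats$chunk.xml";
        $fp = fopen($name, 'w');
        fwrite($fp, "<?xml version=\"1.0\"?>\n<stats>\n");
        fwrite($master, "  <file src=\"$name\" />\n");
    }
    fwrite($fp, '  <item id="' . $row['id'] . '"><name>'
        . htmlspecialchars($row['name']) . '</name><score>'
        . $row['score'] . "</score></item>\n");
    $count++;
}
if ($fp) { fwrite($fp, "</stats>\n"); fclose($fp); }
fwrite($master, "</files>\n");
fclose($master);
mysql_close($db);
?>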

If this isn't going to work for you, give us some more information about what you are doing.


Thanks

Luke

murple
06-27-2001, 02:38 AM
The database is moderate in size; I'm getting about 200,000 hits a month and using PHP as the middle man. The data is split into 3 tables: admin stuff (member access logging), stats (often searched, moderately updated) and settings (member updated, sometimes accessed). No database calls are needed to display the site, but a call is initiated when someone runs a search, and that's when the admin falls asleep =) PHP makes a data request to the stats table and sifts through the results.

That table is of course the largest, and I'm having trouble visualizing a strategy that would provide a performance gain: putting the whole table into a single file would imaginably be costly, I'd have to read the file in for each search, and on top of that the XML would keep getting overwritten as updates come in. A friend suggested caching search results, but I don't see how I could use that for subsequent searches.
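
What he suggested, as far as I can make out, is something like this: cache each result set to a file keyed on the query, and reuse it if it's fresh enough (untested; the paths, columns and lifetime are made up, and it assumes a DB connection is already open):

<?php
// friend's suggestion as i understand it: cache each search's result set to
// a file keyed on the query, reuse it while it's fresh, otherwise hit the DB
// (paths, column names and the lifetime are made up; assumes an open DB link)
function cached_search($term) {
    $cachefile = 'cache/' . md5($term) . '.dat';
    $lifetime  = 600;   // ten minutes

    if (file_exists($cachefile) && (time() - filemtime($cachefile)) < $lifetime) {
        return unserialize(join('', file($cachefile)));
    }

    $result = mysql_query("SELECT id, name, score FROM stats WHERE name LIKE '"
        . addslashes($term) . "%'");
    $rows = array();
    while ($row = mysql_fetch_assoc($result)) {
        $rows[] = $row;
    }

    $fp = fopen($cachefile, 'w');
    fwrite($fp, serialize($rows));
    fclose($fp);
    return $rows;
}
?>

It only pays off if people keep typing the same terms though, which is where I get stuck.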

tupps
06-27-2001, 02:52 AM
If you are doing searches you are going to have a lot of difficulty trying to cache data, especially as people will be able to type all sorts of things.

My guess is that your only real answer is the big stick option: hit your DB/sys admin with it.

Thanks

Luke

murple
06-27-2001, 02:49 PM
Sigh, I guess I'll have to get one with nails on the end lol