Ticket #865 (new enhancement)

Opened 16 months ago

Last modified 16 months ago

synchronisation

Reported by: annesley Owned by: ade
Priority: trivial Milestone:
Component: Unassigned Keywords:
Cc: chris, sam, ben Estimated Number of Hours: 0.0
Add Hours to Ticket: 0 Billable?: yes
Total Hours: 0.79

Description

these are ideas; please query them.

we are synchronising between different data structures: WordPress, Drupal and anything else the plugin is installed on. therefore standard *database-level* distributed synchronisation management tools will not be appropriate. this is unfortunate because synchronisation is a big task. however, it is possible that there are some CRUD / REST based sync tools. so: we need an XML abstraction layer (partially done already), produced by the Drupal, WordPress, etc. plugins, that is standardised and can then be compared and synced via standard API calls.
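purely as an illustration (TypeScript, not the existing IIRS code; the field names here are assumptions, not the real schema), the kind of framework-neutral record the abstraction layer might emit could look like this:

{{{
// Illustrative only: these field names are assumptions, not the actual IIRS schema.
interface TransitionRecord {
  guid: string;                      // network-wide identifier, never the native CMS id
  kind: "user" | "local-group";
  name: string;
  updatedAt: string;                 // ISO-8601 timestamp, used when comparing copies
  fields: Record<string, string>;    // free-form attributes (location, URL, ...)
}

// Serialise a record to a minimal XML fragment so WordPress, Drupal and any
// other host can exchange exactly the same representation (escaping omitted).
function toXml(r: TransitionRecord): string {
  const attrs = Object.entries(r.fields)
    .map(([k, v]) => `  <field name="${k}">${v}</field>`)
    .join("\n");
  return [
    `<record guid="${r.guid}" kind="${r.kind}" updated="${r.updatedAt}">`,
    `  <name>${r.name}</name>`,
    attrs,
    `</record>`,
  ].join("\n");
}
}}}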

Steps:

new Transition Town registration on server A
notify server B that there is new data and send the GUID of this new data
server B then requests only the new data from server A (incremental) using the GUID
server B creates the new item in its database, with a new native ID, using the abstraction layer in its plugin / module (see the sketch below)
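a minimal sketch of that notify / pull flow (TypeScript for illustration only; the /sync endpoints and payloads are assumptions, not an existing API):

{{{
// Hypothetical sketch of the notify / pull steps above; the /sync endpoints
// and payloads are assumptions, not an existing API.
async function notifyListeners(listeners: string[], guid: string): Promise<void> {
  // server A tells every listener that a record with this GUID is new / changed
  await Promise.all(
    listeners.map((base) =>
      fetch(`${base}/sync/notify`, {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ guid }),
      })
    )
  );
}

async function onNotify(sourceBase: string, guid: string): Promise<void> {
  // server B pulls only the new record (incremental) using the GUID...
  const res = await fetch(`${sourceBase}/sync/record/${guid}`);
  const record = await res.json();
  // ...and stores it through its own abstraction layer, which allocates a
  // fresh native ID in the local WordPress / Drupal database.
  const nativeId = await createLocalItem(record);
  console.log(`created ${guid} locally as native id ${nativeId}`);
}

// stand-in for the framework-specific create call
async function createLocalItem(record: unknown): Promise<number> {
  return Math.floor(Math.random() * 100000); // pretend the CMS allocated an ID
}
}}}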

additions to this universal data pool, e.g. a new Transition Town, will be propagated via a network sync request at the point of addition. "listener servers" will then request the new data (incremental only) and, in turn, push that out to all other listeners.
each plugin will therefore extend and expose its CRUD-style synchronisation abstraction functions:

add-user
add-local-group
change-user
etc.

many of these are already available as part of the framework-independent plugin / module
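purely as an illustration of what such an exposed abstraction could look like (TypeScript; the function names mirror the list above but the real plugin functions may differ in naming and signature):

{{{
// Illustrative interface only; the real plugin functions may differ.
// Every call keys on the GUID so the caller never needs to know the
// native WordPress / Drupal ID on the remote site.
interface SyncAbstraction {
  addUser(guid: string, data: Record<string, string>): Promise<void>;
  addLocalGroup(guid: string, data: Record<string, string>): Promise<void>;
  changeUser(guid: string, changes: Record<string, string>): Promise<void>;
}

// A WordPress implementation would map these onto users / posts, a Drupal one
// onto users / nodes; this in-memory stub just shows the shape.
class InMemorySync implements SyncAbstraction {
  private store = new Map<string, Record<string, string>>();
  async addUser(guid: string, data: Record<string, string>) { this.store.set(guid, data); }
  async addLocalGroup(guid: string, data: Record<string, string>) { this.store.set(guid, data); }
  async changeUser(guid: string, changes: Record<string, string>) {
    this.store.set(guid, { ...(this.store.get(guid) ?? {}), ...changes });
  }
}
}}}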

currently, i suggest that ALL plugins contain ALL the international user and Transition Town data.
passwords, emails and contact info will be handled by a 3rd server, either Mozilla Persona or OpenID. user accounts will also be synchronised on to ALL plugins, but without passwords, as those are held on the 3rd server.
this much has already been agreed with Ed. but, ofc, it can be changed :)

new plugin installations will receive a full complement of data at the time of installation. check digits will be periodically shared to check that all data is in line. all users will be able to register and edit their data on ANY website holding the plugin. TT and USER changes and registrations will then propagate via PUSH notifications across the entire network.
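a rough sketch of how those periodic check digits could be computed (illustrative only; the digest format and any /sync/digest endpoint are assumptions):

{{{
import { createHash } from "node:crypto";

// Hypothetical "check digit" for a whole dataset: hash every GUID together
// with its last-modified timestamp in a stable order. Two servers whose
// digests match can assume their copies are in line; a mismatch triggers a
// fuller reconciliation.
function datasetDigest(records: { guid: string; updatedAt: string }[]): string {
  const canonical = records
    .map((r) => `${r.guid}:${r.updatedAt}`)
    .sort()
    .join("\n");
  return createHash("sha256").update(canonical).digest("hex");
}
}}}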
all native IDs will be different, i.e. TT Brixton will have a different ID on each server. thus, as always with synchronisation, all IDs will be transformed to GUIDs by the abstraction API and only GUIDs will be used to analyse the network of data and synchronisation.
login to any website containing the plugin will be transparent (unlike the demo i set up) through the normal WordPress and Drupal login screens. the plugin will intercept failed authentication and attempt to authenticate against the universal servers.
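a hypothetical sketch of that interception (accounts.example.org is a placeholder; the real flow via Persona / OpenID would be redirect-based rather than a password POST):

{{{
// Sketch only: how the plugin's login hook might fall back to the shared
// account server when local authentication fails.
async function authenticate(username: string, password: string): Promise<boolean> {
  if (await localLogin(username, password)) {
    return true; // normal WordPress / Drupal login succeeded
  }
  // local failure: ask the 3rd server that actually holds the passwords
  const res = await fetch("https://accounts.example.org/verify", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ username, password }),
  });
  return res.ok;
}

// stand-in for the CMS's own credential check
async function localLogin(username: string, password: string): Promise<boolean> {
  return false; // pretend this user has no local password record
}
}}}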
new accounts created via universal registration on any server will have a framework specific configurable role and thus permissions on that server will be set by the administrator specific to that server.

Change History

comment:1 follow-up: ↓ 4 Changed 16 months ago by chris

  • Cc paul, ben added
  • Add Hours to Ticket changed from 0.0 to 0.25
  • Total Hours changed from 0.0 to 0.25

Do you have some figures on the churn rate for the data?

There are 479 Official Initiatives, how often are new ones added and how often are existing ones edited / removed?

I'm asking this because what you are describing sounds complicated, and complicated web projects are time consuming to design, implement and maintain, and time consuming web projects are expensive web projects...

If the Initiatives needing editing are just a few a week, say 4, and each one would take someone 30 mins of verification and manual data entry into the database, and the person in charge of the database was paid £4 per hour, then manually maintaining the list would cost less than £500 per year for this person's time -- I'm not convinced that a simple online database, a web form, perhaps an open email list and someone spending a few hours a week maintaining the data wouldn't be simpler and cheaper?

I got a phone call from the Yellow Pages (yell.com) last week to check our work details for entering in their database; this is a far bigger database than ours and yet they seem to do it manually to some extent. Are we sure we are not trying to over-engineer this?

comment:2 follow-up: ↓ 3 Changed 16 months ago by sam

Hi All


>  I'm asking this because what you are describing sound complicated and
>  complicated web projects are time consuming to design, implement and
>  maintain, and time consuming web projects are expensive web projects...
>
>
I think Chris raises a valid point here..

As I understand it there are three problems to solve:

1) Displaying a map/list of all the transition initiatives in the world on
TN.org, for reasons

2) Having a way to contact individual initiatives from 1).

3) Maintaining a backup of the data that is held on National hub sites so
if they close / get hacked / go mental we still have a means of contacting the
individual initiatives

The synchronization does solve all three problems, but in a potentially
expensive way.


An alternative set of solutions might be:

1) Display data held on hubs directly on TN.org by doing API calls in real
time on the hubs data.

2) The data from 1) could include a link to that initiative profile
directly on Hub site. Then fairly trivial to work with hubs to make a
contact form that would send mail to that initiative. Could add a cc
address if we wanted to track these.

3) We could just work with hubs to make sure they have a decent backup
regime. This could be as simple as installing the free wp-dropbox backup
plugin on their sites.

I know we have been around this a few times, the full sync option is
sounding more complicated than I anticipated..

2p

Sam

comment:3 in reply to: ↑ 2 ; follow-up: ↓ 18 Changed 16 months ago by annesley

Replying to sam:

Display data held on hubs directly on TN.org by doing API calls in real
time on the hubs data.

Hi Sam: the real-time querying option would cause every hub site to be called every time any TN.org webpage that needed a full hub list, like the international map, was requested. It adds 1000+* hits to every hub website every day. Any genuine load peak hitting the TN.org would also hit every hub website.

  • - @Chris: i think we average ~1000 visits per day?

comment:4 in reply to: ↑ 1 ; follow-up: ↓ 6 Changed 16 months ago by annesley

Replying to chris:

Hi Chris: i like your thinking. Not important to this thread, but i would suggest £4 / hour is not a reasonable rate. And we currently have 979 registered Initiatives with 246 currently non-existent domains. Once TN has more to offer the Initiatives, and the registration process is easy with the IIRS, we *might* see many many more registrations.

you are absolutely correct that sync jobs are expensive, *especially* if we have to do it with custom code, which is currently looking like the only option.

one of the important deliverables agreed with Ed was that we should provide everyone else with a full complement of data rather than holding it only centrally. this is to encourage and enable innovation, ownership and non-centralised control.

Annesley

comment:5 Changed 16 months ago by chris

  • Add Hours to Ticket changed from 0.0 to 0.1
  • Total Hours changed from 0.25 to 0.35

Replying to annesley:

  • - @Chris: i think we average ~1000 visits per day?

Sorry for the confusion, I wasn't asking about the number of visitors to the web site, I was asking how often the Official Initiatives database, which currently contains 479 entries, is updated -- how often is a new Initiative added? and how often is an existing Initiative edited? -- If this frequency is low then trying to automate everything is potentially going to be far more expensive than manually maintaining the database, isn't it?

comment:6 in reply to: ↑ 4 Changed 16 months ago by chris

  • Add Hours to Ticket changed from 0.0 to 0.03
  • Total Hours changed from 0.35 to 0.38

Replying to annesley:

we currently have 979 registered Initiatives with 246 currently non-existent domains.

Where are those numbers from? How often is a new Initiative added to that list (and where is this list) and how often is an existing Initiative's data edited?

comment:7 Changed 16 months ago by annesley

for your additional information here is the full breakdown of our Initiative data websites with domain and platform use checking. note from this that around 30%, mostly WordPress installs, have their own website hosting and could install plugins:

[total checked] 979 100.00%
[failed] 246 25.13%
[platform:Joomla] 18 1.84%
[platform:SilverStripe] 1 0.10%
[platform:Weebly] 24 2.45%
[platform:blogger.com] 37 3.78%
[platform:drupal] 49 5.01%
[platform:facebook.com] 49 5.01%
[platform:flickr.com] 1 0.10%
[platform:groups.google.co.uk] 3 0.31%
[platform:groupsite.com] 3 0.31%
[platform:iWeb] 1 0.10%
[platform:meetup.com] 3 0.31%
[platform:ning.com] 48 4.90%
[platform:sites.google.com] 13 1.33%
[platform:transitionnetwork.org] 34 3.47%
[platform:unknown] 364 37.18%
[platform:wix.com] 1 0.10%
[platform:wordpress.com] 64 6.54%
[platform:wordpress.org] 266 27.17%

comment:8 follow-up: ↓ 11 Changed 16 months ago by annesley

good questions chris.

the ~1000 hits per day question was related to Sam's suggestion of live API requesting to the Hubs, not your thoughts.

the initiative data that i took is from the Drupal backend view:
https://www.transitionnetwork.org/admin/reports/initiatives/detail

i do not have data on the number of edits per day, but i would agree with you if we are going to say that it is very low and cheaper to do with person power.

however, we are trying to create major increases in engagement from the grass-roots, with each other and with us. so it may become much more important to have sync capability ready and available for much more data...

comment:9 Changed 16 months ago by chris

Replying to annesley:

i would suggest £4 / hour is not a reasonable rate

Sure, but it is more than I am paid, it is more than the UK Living Wage, and 84 percent of the world population subsists on under $20 per day -- I was suggesting that doing some "back of a fag packet" sums to work out how much this software project might cost, e.g. 500 hours of your time at £30 per hour would be £15k, and then comparing this to the cost of someone manually maintaining the database, would be worth doing before embarking on the project?

comment:10 Changed 16 months ago by annesley

note that, as i stated, we would want *every* plugin to have a full complement of data. thus one edit / registration would need to be entered manually on 20+ different hubs, not just TN.org.

comment:11 in reply to: ↑ 8 Changed 16 months ago by chris

  • Add Hours to Ticket changed from 0.0 to 0.02
  • Total Hours changed from 0.38 to 0.4

Replying to annesley:

the initiative data that i took is from the Drupal backend view:
https://www.transitionnetwork.org/admin/reports/initiatives/detail

Thanks, so 479 Official Initiatives and 500 Muller Initiatives.

Anyone have some figures for the rate of change in this data over time?

comment:12 Changed 16 months ago by annesley

i am paid £19 / hour. £4 / hour is only legal for under-18s in the UK. however, we could / should start employing people in other countries where possible? i'm not sure what TN's attitude is to this...

comment:13 follow-up: ↓ 15 Changed 16 months ago by sam

Hi All

>
>  Hi Sam: the real-time querying option would cause every hub site to be
>  called every time any TN.org webpage that needed a full hub list, like the
>  international map, was requested. It adds 1000+* hits to every hub website
>  every day. Any genuine load peak hitting the TN.org would also hit every
>  hub website.
>

1000 page loads is across all pages of the site. Maps are probably loaded
100-200? times a day.

Couldn't the hubs just have a cache for hubsite.org/xml that expires after
a day or so? Serving 100 requests a day from a cache shouldn't be too
onerous.

In addition couldn't we cache the page? Or is that tricky?
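Something like this is all I have in mind for the hub side (a rough sketch only; the cache file path and the buildFeed() helper are made up):

{{{
import { existsSync, readFileSync, statSync, writeFileSync } from "node:fs";

const CACHE_FILE = "/tmp/initiatives.xml";   // made-up cache location
const MAX_AGE_MS = 24 * 60 * 60 * 1000;      // regenerate once a day

// Serve the cached feed if it is still fresh, rebuild it otherwise, so a
// hundred-odd requests a day never touch the hub's database more than once.
function serveFeed(): string {
  if (existsSync(CACHE_FILE) &&
      Date.now() - statSync(CACHE_FILE).mtimeMs < MAX_AGE_MS) {
    return readFileSync(CACHE_FILE, "utf8");
  }
  const xml = buildFeed();                   // made-up helper: query the CMS
  writeFileSync(CACHE_FILE, xml);
  return xml;
}

function buildFeed(): string {
  return "<initiatives></initiatives>";      // stand-in for the real export
}
}}}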

2p

Sam

comment:14 Changed 16 months ago by chris

Replying to annesley:

i am paid £19 / hour. £4 / hour is only legal for under-18s in the UK.

Crap, so sorry, my "back of a fag packet" figures were supposed to be based on £8 per hour -- I wanted to pick a figure above the UK Living Wage.

comment:15 in reply to: ↑ 13 Changed 16 months ago by chris

  • Add Hours to Ticket changed from 0.0 to 0.14
  • Total Hours changed from 0.4 to 0.54

Replying to sam:

1000 page loads is across all pages of the site.

No, the rough figure of 1k is unique visitors excluding bots.

Maps are probably loaded 100-200? times a day.

Looking at the stats for the map for this month (two weeks) we have:

  • 382 Page views
  • 248 Unique page views
  • 00:01:50 Avg. time on page

For this year (1st Jan 2015 to 15th July 2015) we have:

  • 7419 Page views
  • 5270 Unique page views
  • 00:01:40 Avg. time on page

So, in total around 185 hours has been spent by people looking at the map, so far, in 2015 (7419*1.5=11128.5 and 11128/60=185.4) -- we are in danger of spending more time writing code to generate the map than people spend looking at the map?

comment:16 follow-up: ↓ 17 Changed 16 months ago by sam

Hi all

>  > Maps are probably loaded 100-200? times a day.
>
>  Looking at the [https://stats.transitionnetwork.org/ stats] for [https://www.transitionnetwork.org/initiatives/map the map] for this month (two weeks) we have:
>
>  * 382 Page views
>  * 248 Unique page views
>  * 00:01:50 Avg. time on page


Sorry, I was using 'maps' as a shorthand for 'data from the initiative database'. In addition the map itself appears on several pages.

It's probably all the numbers for 'initiatives' here

https://stats.transitionnetwork.org/index.php?module=CoreHome&action=index&idSite=1&period=range&date=previous30#/module=Actions&action=menuGetPageUrls&idSite=1&period=range&date=previous30



But I think the main point is that it isn't a big number, and almost
certainly less than 100 calls a day on that dataset.

Thanks

Sam

comment:17 in reply to: ↑ 16 Changed 16 months ago by chris

  • Add Hours to Ticket changed from 0.0 to 0.09
  • Total Hours changed from 0.54 to 0.63

Replying to sam:

It's probably all the numbers for 'initiatives' here

https://stats.transitionnetwork.org/index.php?module=CoreHome&action=index&idSite=1&period=range&date=previous30#/module=Actions&action=menuGetPageUrls&idSite=1&period=range&date=previous30

Thanks, so, for the last two weeks we have:

  • 2401 page views
  • 1720 unique page views

That is around 170 page views per day.

And for this year (call it 6 months, it's just over) we have:

  • 44545 page views
  • 32427 unique page views

That is more like 250 page views per day (30*6=180 days, 44545/180=247).

It's worth noting that the Piwik stats are people -- there will be twice or even three or four times as many visits from bots.

comment:18 in reply to: ↑ 3 Changed 16 months ago by chris

  • Add Hours to Ticket changed from 0.0 to 0.01
  • Total Hours changed from 0.63 to 0.64

Revisiting this:

Replying to annesley:

  • - @Chris: i think we average ~1000 visits per day?

Yes, 1k requests per day from bots and people for pages with maps on them is a realistic figure.

comment:19 Changed 16 months ago by sam

> An alternative set of solutions might be;
>
> 1) Display data held on hubs directly on TN.org by doing API calls in real
> time on the hubs data.
>

With a cached /xml file on the hub sites I don't see that this would be a
problem.

Just had a look at http://umap.openstreetmap.fr/ and this does remote data
feeds out of the box & could provide us with an embeddable map to go on
tn.org / belgiumhub.org etc etc

All we'd need to do is make the IIRS output GeoJSON, then plug the URLs into umap & share the embed code.

There are various options for theming the information that appears in the
pop up.

I'm sure we could hack it together in Drupal, but it would probably take
some £time..

2p

Sam

comment:20 Changed 16 months ago by annesley

@Sam: note that, as i stated, we would want *every* plugin to have a full complement of data, not just us.

the page load time for the TN map would also be extremely long with it trying to resolve 20+ calls to hubs that may not necessarily all be up.

caching is generally bad news. Drupal caching is one of the reasons it is hell to program against compared to something like WordPress. we end up in the situation where people register and then can't find themselves on the map. also caching is, to some degree, the equivalent of synchronising the data to servers that request it, albeit easier, and also on demand.

comment:21 Changed 16 months ago by sam

Hi all

>  @Sam: note that, as i stated, we would want *every* plugin to have a full complement of data, not just us.
>

Yes, I know you said that; what I'm exploring is whether we actually _need_
to do that. For me the place for Belgian data is on the Belgian hub. So
long as we can interrogate it, why do we need to copy it?


>  the page load time for the TN map would also be extremely long with it
>  trying to resolve 20+ calls to hubs that may not necessarily all be up.
>

Maybe, but that's a case for supporting the hubs to have good policies in
place around caching, minifying, page size etc. to reduce load on their
servers, and making sure they spec a suitable server for their load.

>  caching is generally bad news.

I'm not sure I agree! I'd have to spend much more on the sites I host if I
just kept chucking resources at the server rather than caching. Sure it can
be a pain in the arse, but such is life..

2p

Sam



comment:22 follow-up: ↓ 30 Changed 16 months ago by chris

  • Add Hours to Ticket changed from 0.0 to 0.15
  • Total Hours changed from 0.64 to 0.79

Going back to the big picture, with a view that, generally, automating tasks only makes sense if the automation is cheaper than doing the task manually. We have a database:

  • With 1,000 entries
  • Viewed by people around 250 times a day (once every 5 mins)
  • Accessed by robots perhaps 750 times a day

Assuming 1 entry needs adding / editing per weekday, each edit takes 30 mins, and the person doing the editing is paid £8 per hour, then we have a rough cost for manually maintaining the database of:

5x30=150 mins per week or 2.5 hours

2.5x52=130 hours per year

130x8=1040 pounds per year

So it seems to me that if it will cost more than a couple of grand to automate the updating of this database it would probably be cheaper to do it manually?

Have I made any big mistakes in this "back of a fag packet" calculation?

Last edited 16 months ago by chris

comment:23 follow-up: ↓ 24 Changed 16 months ago by sam

Hi Chris

>  We have a database:

The point is that 'we', as in the 'transition movement' rather than 'the charity Transition Network', actually have several incomplete, overlapping databases.

The US maintains its own initiative database, as do Norway, now Belgium and
probably others.

The question is whether the 'we have a big database and everyone should use
it' model is relevant.

Do we _need_ to maintain this database at all, if we can instead
interrogate the National databases?

If the answer is yes, i.e. we are going to stick with our colonial monolithic
model, then probably manual updating would be cost effective.

But we may be able to do something that eliminates that admin overhead
almost completely: allow hubs to own their own data and interrogate it via
API to provide a global view.

2p

Sam

comment:24 in reply to: ↑ 23 Changed 16 months ago by chris

Replying to sam:

we may be able to do something that eliminates that admin overhead almost completely

Does anyone know what the current admin overhead is?

allow hubs to own their own data and interrogate it via API to provide a global view.

I agree that an efficient and elegant technical implementation that allows this would be very cool, but what might it cost?

comment:25 Changed 16 months ago by sam

Hi all


>  Does any know what the current admin overhead is?
>

I probably spend an hour or two a week dealing with issues relating to the
existing initiative database.

So £600 - £1000 a year. But it's worth noting the current situation &
architecture isn't working. The database is very out of date & out of sync
with the US/ Norway etc

>  I agree that an efficient and elegant technical implementation that allows this would be very cool, but what might it cost?
>

Well there is an existing XML API:
http://annesley.parrot.transitionnetwork.org/IIRS/export

Not sure what it would take to turn it into GeoJSON:
https://en.wikipedia.org/wiki/GeoJSON Annesley?
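For illustration, GeoJSON for umap is just a FeatureCollection of Point features; assuming the IIRS export can give a name and lat/lng per initiative (the field names and coordinates below are made up), the conversion might be no more than:

{{{
// Assumed minimal shape pulled out of the IIRS XML export.
interface Initiative { name: string; lat: number; lng: number; url?: string }

// A GeoJSON FeatureCollection of Point features, which umap can load
// directly from a remote URL.
function toGeoJson(initiatives: Initiative[]) {
  return {
    type: "FeatureCollection",
    features: initiatives.map((i) => ({
      type: "Feature",
      geometry: { type: "Point", coordinates: [i.lng, i.lat] }, // GeoJSON order is [lng, lat]
      properties: { name: i.name, url: i.url ?? "" },
    })),
  };
}

// example with made-up coordinates
console.log(JSON.stringify(
  toGeoJson([{ name: "Transition Town Brixton", lat: 51.46, lng: -0.11 }]), null, 2));
}}}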

Then maybe an hour or two to set up Umap & embed it?

Providing support and advice to people running hub websites to make sure
they are backed up and loading fast could take a long time potentially, but
I'd argue it's exactly the kind of work TN should be doing..

2p

Sam

comment:26 Changed 16 months ago by annesley

Sam:

> > the page load time for the TN map would also be extremely long with it trying to resolve 20+ calls to hubs that may not necessarily all be up.
>
> Maybe, but that's a case for supporting the hubs to have good policies in place around caching, minifying, page size etc to reduce load on their servers, and making sure they spec a suitable server for their load.

Sorry Sam, but that is not going to happen: we and the hubs absolutely do not have the time and money to ensure all data sources are always up and running fast. Querying 20+ other websites for data during one single page call is *very* bad design. Even if all 20+ data sources were well funded professional hosting arrangements it would still be a *very* bad idea. To be clear: every time that *any* of the data sources was down, the page load time for the TN map would hit the down data source timeout for *every* visitor during that period.

@Chris: i think you would agree here no?

comment:27 Changed 16 months ago by annesley

@Sam: the reason that Ed decided to implement full data sharing across the whole network was because it enabled and encouraged innovation with the data on the hubs.

it was decided that innovation can happen at hub level *much* more easily if the data appears natively in the hub database. hub programmers can easily build tools around local data repositories using standard plugins without needing to do any custom code. it will be *much* more difficult for them to try and programmatically deal with API calls to TN, caching and the resultant custom display.

it is also a non-centralised, non-English-centric model.

comment:28 Changed 16 months ago by annesley

i *am* actually expecting to see technical and community innovation from places like Belgium once they have easy local database access to everything.

comment:29 Changed 16 months ago by annesley

i want to throw a recent thought into the mix to give an idea of where i am hoping the system might expand to:

i have a script that can automatically and periodically copy / cache *all* posts, events and comments with geo-data from *all* Transition Town websites into a local database and categorise them.

then it is simple to write a WordPress plugin that dynamically lists related posts from the entire Transition-o-sphere while someone is typing a new post, based on what they are typing. i have done this sort of system once already. this gives a very connected blogging experience.
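purely to illustrate the idea (this is not the script i mentioned), the lookup against the local copy might be as simple as:

{{{
// Purely illustrative: score cached posts from the local copy of the pool
// against the text someone is currently typing.
interface CachedPost { title: string; body: string; sourceSite: string }

function relatedPosts(draft: string, pool: CachedPost[], limit = 5): CachedPost[] {
  const words = new Set(
    draft.toLowerCase().split(/\W+/).filter((w) => w.length > 3)
  );
  return pool
    .map((post) => {
      const text = `${post.title} ${post.body}`.toLowerCase();
      let score = 0;
      for (const w of words) if (text.includes(w)) score++;
      return { post, score };
    })
    .filter((s) => s.score > 0)
    .sort((a, b) => b.score - a.score)
    .slice(0, limit)
    .map((s) => s.post);
}
}}}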

whilst that may or may not be a good idea, my point is: i see our job as providing data, and the hubs' job as innovating with it. an essential component of what we must do is enable the IT and community innovators around the world to innovate.

that is why the data should be on every local server that it can be.

comment:30 in reply to: ↑ 22 Changed 16 months ago by annesley

Replying to chris:

you need to include all the other costs of employing someone. this roughly equates to doubling the amount because of all the overheads and tax that an employer must pay. so double it to start.

secondly, the data entry person would need to check all the hub sites to try and find out if there had been any new TTs or any changes.

thirdly, again this only allows for the full data set to be on TN. there are many requirements that the data is everywhere. for example: many hubs want to show the international map. that will take additional plugin IFRAME development if the data is not local.

fourthly, there would be a delay between the TT registering or changing data and that being reflected on the TN website.

fifthly, it's a McJob.


comment:31 Changed 16 months ago by sam

>
>
>  Sorry Sam, but that is not going to happen: we and the hubs absolutely do
>  not have the time and money to ensure all data sources are always up and
>  running fast.


Well, no, not 100% uptime, but choosing a decent host and selection of
plugins isn't that hard a problem for hubs to solve. 96-98% uptime should
be fairly achievable with little effort or expenditure.

>  To be clear: every time that *any* of the data sources was down, the page load time for the TN map would hit the down data source timeout for *every* visitor during that period.
>

I don't think that's quite how it works actually. Data is loaded
asynchronously.

So the user lands on the page:

- Map tiles load - 2 seconds or so
- site1.org/georss loads in 1.5 seconds as they have a nice server; their data appears on the map before the tiles have even loaded.
- site2.org/georss loads in 3 seconds as they have a cheap server; their data appears on the map.
- site3.org/georss is down for some reason, so their data fails to load. After the default browser timeout the page will stop loading and any sites that haven't returned data won't appear, but all those that have will appear.
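As a sketch of that behaviour (illustrative only; the /georss paths here are not real endpoints), each feed gets its own timeout so a down hub only loses its own pins:

{{{
// Each hub's feed is fetched independently with its own timeout, so a hub
// that is down only loses its own pins rather than blocking the whole map.
async function loadHubLayers(hubs: string[], timeoutMs = 5000): Promise<void> {
  const results = await Promise.allSettled(
    hubs.map(async (base) => {
      const controller = new AbortController();
      const timer = setTimeout(() => controller.abort(), timeoutMs);
      try {
        const res = await fetch(`${base}/georss`, { signal: controller.signal });
        return await res.text();   // would be parsed and drawn on the map
      } finally {
        clearTimeout(timer);
      }
    })
  );
  results.forEach((r, i) =>
    console.log(`${hubs[i]}: ${r.status === "fulfilled" ? "loaded" : "skipped (down or slow)"}`)
  );
}

// loadHubLayers(["https://site1.org", "https://site2.org", "https://site3.org"]);
}}}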

2p

Sam


comment:32 Changed 16 months ago by annesley

@Sam: are you suggesting that the client web browser should make the data calls?
or the server PHP should run all the data calls?

comment:33 Changed 16 months ago by annesley

and don't forget there are several functions already requiring this data on TN. for example: find a TT near you. lists of TTs. how will they work?

comment:34 Changed 16 months ago by paul

Hello Gentlemen,

Would you remove me from this conversation? This is all going over my head
:)

Have a good weekend

Paul

comment:35 Changed 16 months ago by sam

Hi Annesley

I'm not entirely sure what you mean?

Here is an example of remotely loaded data in Umap:
http://umap.openstreetmap.fr/en/map/pubs-live-data_16071#10/52.1946/-1.7008

About this page:
http://www.mappa-mercia.org/2014/09/creating-an-always-up-to-date-map.html

Looking at the page analysis tab here:
http://tools.pingdom.com/fpt/#!/dCEd8e/http://umap.openstreetmap.fr/en/map/pubs-live-data_16071#14/52.1946/-1.7008

You can see that it requested 340kb from overpass-api.de (The source
of the data in this case)

The whole page loads in 1.74 seconds.

Thanks

Sam

comment:36 Changed 16 months ago by annesley

right, so the client browser is loading the data. yes, that is asynchronous and the data appears as it loads.

how will the system know where to query the data from?

how will TN websites carry out other functions requiring the data? e.g. find a TT near you and alphabetical lists of TTs or search the TT directory, etc.

comment:37 Changed 16 months ago by annesley

  • Cc paul removed