Comments on Spyced: "Google App Engine: Return of the Unofficial Python Job Board Feed" (5 comments)

Jonathan Ellis (2008-07-11 11:09):
Well, I posted the source. Patches accepted. :)

Kevin H (2008-07-11 10:48):
This is a really useful feed. Thanks for making it.

It would be helpful, though, if the location information for the positions weren't lost in the translation. Any possibility of adding that?

Jonathan Ellis (2008-04-11 01:30):
I considered doing that, but the complexity isn't worth it here.

The complexity is: what if two requests come in at the same time? You need some kind of locking to prevent duplicate entries, and query-before-insert is still prone to race conditions. GAE doesn't provide locking either, so you'd have to hack something together using the Datastore. (I'm assuming the transactional semantics are strong enough that this is even possible; I haven't checked.)

So even for a simple case like this one, working around the lack of scheduled tasks isn't as easy as you'd think.
It really is a critical part of any non-toy application environment.

Ksenia (2008-04-11 01:24):
> If there were a scheduled task api, my feed generator could poll the python jobs site hourly or so, and store the results in the Datastore, instead of having a 1:1 ratio of feed requests to remote fetches.

You could probably store the results in the Datastore along with a timestamp, and update them during a request every hour or so ("if the data is older than one hour, fetch again"). That would make some requests take more time, but at most once an hour.

jek (2008-04-10 09:28):
This is great! Thank you!
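Jonathan's point about query-before-insert can be illustrated outside of GAE. Below is a minimal sketch in plain Python, with threads standing in for concurrent requests and a list standing in for the Datastore; the names (`add_entry`, `entries`) are illustrative, not any real API. The barrier forces both workers past the existence check before either inserts, which is exactly the interleaving that produces a duplicate:

```python
import threading

def add_entry(entries, key, barrier=None, lock=None):
    # Query-before-insert: check whether the key exists, then append if missing.
    if lock is not None:
        with lock:                 # check + insert become one atomic step
            if key not in entries:
                entries.append(key)
        return
    missing = key not in entries   # the "query"
    if barrier is not None:
        barrier.wait()             # force both workers past the check first
    if missing:
        entries.append(key)        # the "insert"

# Two concurrent requests, no locking: both see "missing" and both insert.
entries = []
barrier = threading.Barrier(2)
threads = [threading.Thread(target=add_entry, args=(entries, "job-1", barrier))
           for _ in range(2)]
for t in threads: t.start()
for t in threads: t.join()
# entries now holds "job-1" twice: a duplicate entry.

# The same two requests with a lock: only one insert happens.
entries_locked = []
lock = threading.Lock()
threads = [threading.Thread(target=add_entry,
                            args=(entries_locked, "job-1", None, lock))
           for _ in range(2)]
for t in threads: t.start()
for t in threads: t.join()
# entries_locked holds "job-1" once.
```

An in-process `threading.Lock` only works inside one process, which is why the comment notes that on GAE, where no such lock spans instances, you would have to build something on top of the Datastore's transactions instead.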
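Ksenia's refresh-on-read suggestion can be sketched in plain Python. This is illustrative only: the `CachedFeed` class and its names are made up, an in-memory attribute stands in for the Datastore row, and the clock is injectable so the one-hour threshold can be simulated in a test rather than waited out:

```python
import time

class CachedFeed:
    """Refresh-on-read: keep the last fetched result with its timestamp,
    and re-fetch only when the cached copy is older than max_age seconds."""

    def __init__(self, fetch, max_age=3600, clock=time.time):
        self.fetch = fetch          # function that does the slow remote fetch
        self.max_age = max_age      # one hour, per the comment
        self.clock = clock          # injectable for testing
        self.data = None
        self.fetched_at = None

    def get(self):
        now = self.clock()
        if self.fetched_at is None or now - self.fetched_at > self.max_age:
            self.data = self.fetch()    # the occasional slow request
            self.fetched_at = now
        return self.data

# Usage with a fake clock: only the first call and the post-expiry call fetch.
fetch_calls = []
fake_now = [0]
feed = CachedFeed(lambda: fetch_calls.append(1) or "jobs",
                  max_age=3600, clock=lambda: fake_now[0])
feed.get()           # cache empty: fetches
feed.get()           # fresh: served from cache
fake_now[0] = 4000   # more than an hour later
feed.get()           # stale: fetches again
```

This matches the trade-off Ksenia describes: most requests are served from the stored copy, and at most one request per hour pays for the remote fetch.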