Expo Computer & Website

Wookey

Data Storage

  • Mercurial repositories
    • website ('expoweb')
    • troggle
    • loser
    • tunneldata
  • expofiles - just files
    • photos
    • presentations
    • posters
    • maps
    • documents
    • ...

On expo

  • Expobox as server:
    • expo.potato.hut
    • full Debian repo available
  • Expobox as desktop
    • local checkouts in expofiles
    • raw files in expofiles/expoimages
    • sources/docs/utils for
      Tunnel, Survex, Therion, SAP, DistoX

Website Structure

Static web pages

indexes, festering, background, handbook

Generated web pages


prospecting guide

Troggle pages

cave pages, survey scans, logbook entries, trip lists


Troggle 'wraps' the other stuff



WEBSITE LAYOUT

  • years has per-year info (reports, lists, policy, logbooks)
  • handbook contains the handbook
  • 1626, 1623 are the areas
    • should be merged - the split just confuses people
  • noinfo - dir that used to be hidden on the public web
    • scripts live in noinfo/
  • generated files are regenerated by running 'make' in the expoweb dir
      (clicking 'Others' -> 'Regenerate website' does the same)
    • prospecting guide scripts are in noinfo/prospecting_guide_scripts
  • folk is the people list, now generated from noinfo/folk.csv by
    noinfo/make-folklist.py (rough sketch of the idea below)
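A rough sketch of the idea only - not the real make-folklist.py; the CSV
layout and output path here are assumptions:

    #!/usr/bin/env python
    # Sketch only - noinfo/make-folklist.py and folk.csv differ in detail;
    # this just shows the "CSV in, HTML people table out" idea.
    import csv

    def make_folklist(csv_path='noinfo/folk.csv', html_path='folk/index.htm'):
        with open(csv_path, newline='') as f:
            rows = list(csv.reader(f))
        with open(html_path, 'w') as out:
            out.write('<table>\n')
            for row in rows:
                cells = ''.join('<td>%s</td>' % cell for cell in row)
                out.write('<tr>%s</tr>\n' % cells)
            out.write('</table>\n')

    if __name__ == '__main__':
        make_folklist()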


Editing


Don't try to edit generated files.

Can edit in various ways:
  • In expo file manager (best)
  • Live on website (not recommended)
  • (Clone and) edit on another computer


You always have to commit afterwards, however you edited.



Managing Files


EVERYTHING should be on the website!

(HUGE BINARIES SHOULD NOT BE IN VCS)


Info should go in the website handbook (as HTML or PDF)

Layout files edited every year make sense as ODT docs

(survey index, labels, accounting procedure)

Link from website, store in website or expoimages


Don't just create a random doc and leave it in 'Desktop'.

Think about where someone would look for this info later

Survex Dataset

  • caves
  • surface
  • surface/terrain
  • fixedpts
  • template: example files


  • docs/datamaintenance.txt has some rules.
    Should be in the handbook
  • surface and cave data are connected by ents.svx, which is generated by
    make-ents.py; it fishes out every point marked with *entrance in the
    dataset (see the sketch after this list).  ARGE do this wrong, using
    *fix inside the cave
  • the all* files make various groupings of data
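Not the real make-ents.py, just a sketch of the "fish out *entrance" step
under assumed paths; the real script goes on to build ents.svx from what it
finds:

    #!/usr/bin/env python
    # Sketch only - walk the dataset and collect every '*entrance <station>'
    # line; the loser/caves path and output handling are assumptions.
    import os

    def find_entrances(root='loser/caves'):
        """Yield (svx file, station name) for each *entrance in the dataset."""
        for dirpath, dirnames, filenames in os.walk(root):
            for name in filenames:
                if not name.endswith('.svx'):
                    continue
                path = os.path.join(dirpath, name)
                with open(path) as f:
                    for line in f:
                        words = line.split()
                        if len(words) >= 2 and words[0].lower() == '*entrance':
                            yield path, words[1]

    if __name__ == '__main__':
        for path, station in find_entrances():
            print(path, station)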

Adding a CAVE

  • Location
  • Photo
  • Survey data
  • Survey notes
  • Description
    • rigging guide
  • Tag status
  • Question marks
  • Drawn up (paper/tunnel/therion)


You are not finished until all of these are done


ACTUALLY ADDING A CAVE

       Survey data in loser/caves/<number>

       Simple caves go in:

            location, name, description in expoweb/noinfo/1623/cave_data

       Complex caves go in:

             expoweb/1623/<cavenumber>

ACTUALLY ADDING A CAVE - pt2

        Notes go into physical survey folder. Scans go into expofiles/expo_images_surveyscans/<year>/<year>-<folderID>

        QMs don't seem to have a plan. Needs standardising.

Caves with no number are called <year>-<initials>-<nn> and go in file expoweb/noinfo/cave_data/<kataster_area>-<year>-<initials>-<nn>

Troggle 

Django-based.

URL list filters requests: /cave, /expedition, /survey

Templates for the pages

Processors to generate them

Parsers to read data in from the website files

Database back end (MySQL)
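Roughly what that shape looks like in Django terms - a sketch with made-up
names, not troggle's actual code:

    # Illustrative only: the URL list routes /cave, /expedition, /survey to
    # views; each view reads objects the parsers loaded into the database
    # and renders a template.  Names and models here are assumptions.
    from django.urls import path
    from django.shortcuts import render
    from . import models   # hypothetical models filled in by the parsers

    def cave(request, cave_id):
        cave = models.Cave.objects.get(slug=cave_id)
        return render(request, 'cave.html', {'cave': cave})

    urlpatterns = [
        path('cave/<slug:cave_id>/', cave, name='cave'),
        # /expedition/<year>/ and /survey/<id>/ follow the same pattern
    ]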


The files are the 'real' database.

Run databaseReset.py reset to rebuild it from the files.

Troggle updating


For a new year, edit expoweb/noinfo/folk.csv

This is the thing that 'creates' a new 'expedition' (not the logbook!)

Run databaseReset.py reset

to reparse everything.
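As a sketch of why the folk.csv edit is what creates the expedition - this
assumes one header column per expedition year, which is a guess for
illustration, not the documented format:

    # Sketch only: list the year columns in folk.csv's header row; the
    # assumption is that the parsers make an 'expedition' per year found.
    import csv

    def expedition_years(path='expoweb/noinfo/folk.csv'):
        with open(path, newline='') as f:
            header = next(csv.reader(f))
        return [col for col in header if col.strip().isdigit()]

    # Add the new year's column, then run 'databaseReset.py reset' so the
    # parsers re-read everything and the new expedition appears.
    print(expedition_years())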


LOADS MORE I DIDN'T HAVE TIME FOR


....




:-)