Import Nación Rotonda's data #40

Open · 2 tasks done
ccamara opened this issue Feb 22, 2016 · 4 comments
ccamara (Member) commented Feb 22, 2016

Data from Nación Rotonda is stored in a Fusion Table that exports results in KML format, which has to be imported into the website.

  • Create parser
  • Import and assign all values as "Desenterrados"
@ccamara ccamara added this to the Alpha 1 milestone Feb 22, 2016
@ccamara ccamara self-assigned this Feb 22, 2016
@ccamara ccamara changed the title Create a KML Parser for Nación Rotonda's data Import Nación Rotonda's data Feb 22, 2016
ccamara (Member, Author) commented Feb 22, 2016

Unfortunately, the feeds_kml_parser module is not mature enough: there is no way to map the location into separate latitude and longitude fields, which are the ones we are using.

We should find a workaround consisting of cleaning the data first and then creating a regular CSV importer.
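For illustration, a minimal sketch of that cleaning step in Python (the project itself uses an R script and the Drupal Feeds CSV importer; this is just to show the idea). It reads a KML export, pulls each Placemark's name and coordinates, and writes a CSV with separate latitude/longitude columns. The file names and column headers are assumptions, not the project's actual ones.

```python
# Sketch only: convert a KML export into a CSV with separate lat/lon columns,
# so a plain CSV importer can map them. Not the project's actual tooling.
import csv
import xml.etree.ElementTree as ET

KML_NS = {"kml": "http://www.opengis.net/kml/2.2"}

def kml_to_csv(kml_path: str, csv_path: str) -> None:
    tree = ET.parse(kml_path)
    with open(csv_path, "w", newline="", encoding="utf-8") as out:
        writer = csv.writer(out)
        writer.writerow(["name", "latitude", "longitude"])  # assumed headers
        for placemark in tree.iter("{http://www.opengis.net/kml/2.2}Placemark"):
            name = placemark.findtext("kml:name", default="", namespaces=KML_NS)
            coords = placemark.findtext(".//kml:coordinates", default="", namespaces=KML_NS)
            if not coords.strip():
                continue
            # KML stores points as "lon,lat[,alt]"; take the first tuple and reorder.
            first_point = coords.strip().split()[0]
            lon, lat = first_point.split(",")[:2]
            writer.writerow([name.strip(), lat, lon])

# Hypothetical file names, for illustration only.
kml_to_csv("nacion_rotonda.kml", "data/input/nacion_rotonda.csv")
```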

ccamara (Member, Author) commented Feb 22, 2016

Quoting from email:

[...] The good news is that I challenged myself and adapted the R script to manipulate the Fusion Tables data and make it usable for our purposes. You can see the result in this script (specifically from line 37 onwards). It is probably not the cleanest code, but it works. The procedure would be the following:

  1. work with Fusion Tables as we have so far,
  2. generate a CSV,
  3. save it in the data/input folder,
  4. run the R script,
  5. go to http://cinmobiliarios.carloscamara.es/es/import/corpses_csv (or the stable version: http://new.cadaveresinmobiliarios.org/import),
  6. attach the file and click accept.

The result of importing the 823 cadáveres is here: http://cinmobiliarios.carloscamara.es/es/map. The even better news is that we can use the same CSV importer, which makes it possible to find duplicates. In this case it detected that there were 14 cadáveres with the same name, and therefore it only updated their data.

So we only need to wait until all the data is ready and then run the script and the importer.
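Since the email notes that the CSV importer deduplicates by name (the 14 repeated cadáveres were only updated), a quick way to preview which names repeat before uploading could look like the sketch below. This is just an assumption-laden convenience check, not part of the project's R script or importer; the file path and the "name" column header are hypothetical.

```python
# Sketch: count duplicated names in the generated CSV before importing it.
import csv
from collections import Counter

with open("data/input/nacion_rotonda.csv", newline="", encoding="utf-8") as f:
    names = [row["name"] for row in csv.DictReader(f)]  # "name" column is assumed

duplicates = {name: count for name, count in Counter(names).items() if count > 1}
print(f"{len(duplicates)} duplicated names:", duplicates)
```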

ccamara (Member, Author) commented Feb 22, 2016

We'll have to fix the mapping to the URL field, as it doesn't seem to work.

ccamara (Member, Author) commented Feb 23, 2016

Fixed. We only need to wait until all the data is ready.

@ccamara ccamara modified the milestones: Alpha 2, Alpha 1 Sep 9, 2016