Geodesign Technologies
PhD course: 26.-30. August 2019

© University of Copenhagen and Echtzeit GmbH

The PhD course addresses Geodesign Technologies as means of recording, representing, translating, and communicating the real world in urban and landscape planning/design processes. The course goes beyond classic methods based on physical models made from cardboard or clay and well-established digital techniques framed by 2D GIS and 3D CAD. Accordingly, the aim of the course is to embrace, demonstrate, and discuss new, emerging technologies juxtaposing the existing world on the one hand and its digital and physical representations on the other.

Geodesign is a collective method for planning and design of our physical and social environment. It involves an iterative process of recognising, describing, and analysing problems and potentials of our world, and of sketching, formulating, co-creating, designing, and testing proposals. Thus, the implementation of a geodesign process encompasses multiple scales, themes, professions, and stakeholder groups. Geodesign is performed by many people and organisations rather than by individuals. Accordingly, qualified information exchange and mutual respect and understanding between participants are core virtues.

Geodesign technologies strive to mediate the processes and information exchanges involved. The course covers in particular the following topics:

  • Sandboxes: Tangible vs digital landscapes. Digital and physical representations of the terrain.
  • Augmented, virtual, and mixed realities
  • Drones for mapping, 3D modelling, and film making
  • Perception, aesthetics and creativity in relation to Geodesign Technologies in planning/design processes

Teaching will involve fundamental theoretical knowledge, discussion of concrete applications, and hands-on exercises involving both technology and design assignments. Prior to the course, students must acquaint themselves with the course literature and prepare a short presentation of their PhD projects, including expectations of how the course can contribute to their work.

Link to course flyer

Link to formal enrollment page

Student profile

Students should be engaged in landscape and urban design and related spatial technologies (including GIS and CAD), but students specialised primarily in one of the two fields are also encouraged to attend. During workshops and exercises, groups will be set up to span as diverse a range of disciplines and interests as possible, i.e. involving competences within design, technology, and potentially also domain knowledge (e.g. surface water, environment, urban life, accessibility, etc.). Specific operational knowledge about the involved software (including Rhino) is not a prerequisite.

Program

Drones and point clouds

Point clouds are models of objects (landscapes, buildings, trees, etc.) represented as 3D points. They are often very dense, enabling representation of even small details of the objects. Point clouds are generated using either photogrammetry (comparison of images captured from different angles) or LIDAR (Light Detection and Ranging). Recordings can be made on the ground with stationary or mobile units, or from the sky from drones, airplanes, or satellites.
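As a minimal illustration of how such data are handled in practice, the sketch below loads a LAS file and estimates its average point density; the laspy package and the file name site_scan.las are assumptions made for the example, not tools prescribed by the course.

    # Minimal sketch: inspect a drone-derived point cloud stored as LAS/LAZ.
    import numpy as np
    import laspy  # assumed package for reading LAS files

    las = laspy.read("site_scan.las")                      # hypothetical file name
    xyz = np.column_stack([np.asarray(las.x),
                           np.asarray(las.y),
                           np.asarray(las.z)])             # N x 3 coordinates in metres

    extent = xyz.max(axis=0) - xyz.min(axis=0)             # bounding-box size
    footprint = extent[0] * extent[1]                      # planar area covered
    print(f"{len(xyz):,} points, ~{len(xyz) / footprint:.0f} points per m2")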

Within this session, students will learn the workflow of capturing a site through the integration of drone-based and terrestrial, mobile equipment based on LIDAR techniques. For the session, a dense point cloud covering the entire case area will be provided.
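The sketch below illustrates one way the drone-based and terrestrial recordings could be aligned before merging: iterative closest point (ICP) registration. The open3d library, the voxel size, and the file names are illustrative assumptions only.

    # Hedged sketch: align a terrestrial scan to a drone-derived cloud with ICP.
    import open3d as o3d

    drone = o3d.io.read_point_cloud("drone_cloud.ply")            # hypothetical files
    terrestrial = o3d.io.read_point_cloud("terrestrial_cloud.ply")

    # Downsample so ICP runs quickly on dense scans.
    drone_ds = drone.voxel_down_sample(voxel_size=0.05)
    terr_ds = terrestrial.voxel_down_sample(voxel_size=0.05)

    # Point-to-point ICP refines an initial alignment (identity here; in practice
    # both clouds are usually roughly georeferenced already).
    result = o3d.pipelines.registration.registration_icp(
        terr_ds, drone_ds, max_correspondence_distance=0.5,
        estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPoint())

    terrestrial.transform(result.transformation)                  # now merge-ready
    print(result.fitness, result.inlier_rmse)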

Understanding the complexity of our environment is the basis for any informed design/planning process. In addition, we will introduce the basic concepts of utilising the point cloud within a design/planning process. From a technical point of view, we will look into categorisation workflows and animations, and gain insights into various hands-on tools for decomposing the high-resolution point cloud in order to push the limits of design/planning.
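As a hedged example of one such decomposition step, the numpy sketch below splits a cloud into rough height bands (ground, low vegetation, canopy/buildings) using the lowest point in each planar grid cell as a crude local ground level; the cell size and thresholds are illustrative assumptions.

    import numpy as np

    def height_bands(xyz, cell=1.0, low=0.5, high=3.0):
        """Label points 0=ground, 1=low vegetation, 2=canopy/buildings."""
        ij = np.floor(xyz[:, :2] / cell).astype(int)          # planar grid cell per point
        _, inv = np.unique(ij, axis=0, return_inverse=True)   # cell id for every point
        inv = inv.ravel()
        ground = np.full(inv.max() + 1, np.inf)
        np.minimum.at(ground, inv, xyz[:, 2])                  # lowest z within each cell
        dz = xyz[:, 2] - ground[inv]                           # height above local ground
        return np.where(dz < low, 0, np.where(dz < high, 1, 2))

    # labels = height_bands(xyz)   # xyz as in the loading sketch above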

Figure: A point cloud representing a campus yard at IGN. © IGN, 2019

Figure: A point cloud of a tree. © 

Sandboxes. Tangible vs digital landscapes.

A digital sandbox is a setup in which a tangible landscape model shaped in natural or polymer sand is interfaced with a digital model via a scanner (in this case a Kinect 2/3) and a projector. The sand models are scanned and analysed by the computer (in this case embedded in the CAD program Rhino), which in turn projects results and other spatial information back onto the sand: for example contour lines and other terrain derivatives (colours representing elevation, slope, and visibility), drainage patterns and flow lines, orthophotos, topographic maps, land use classifications, etc.
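As a minimal sketch of the analysis step in that scan-analyse-project loop, the code below derives slope from a 2D elevation grid of the kind a depth scan can be converted into; the synthetic surface and cell size are assumptions standing in for a live Kinect frame.

    import numpy as np

    def slope_degrees(dem, cell_size=0.01):
        """Slope in degrees from a 2D elevation grid (metres); cell_size in metres."""
        dz_dy, dz_dx = np.gradient(dem, cell_size)        # finite differences per axis
        return np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))

    # Synthetic stand-in for a scanned sand surface (480 x 640 height values).
    yy, xx = np.mgrid[0:480, 0:640]
    dem = 0.05 * np.sin(xx / 60.0) + 0.03 * np.cos(yy / 45.0)
    slope = slope_degrees(dem)                            # ready to be colour-mapped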

The sand models can represent both real landscapes (built from existing DEM data) and less constrained, more creative sceneries.

Within this session, students will learn the potential of using interactive simulations as a design/planning support tool. The set-up empowers the students to explore the importance of understanding topography in relation to dynamic environmental forces. By combining a physical model with a context-specific simulation, the iterative design process will be guided by informed decisions based upon real-time feedback. The students will learn basic elements of Grasshopper and Rhino (flow simulations).
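A conceptually similar flow routine, written here outside Grasshopper as an illustrative stand-in for the course's simulation, traces a single water drop downhill on an elevation grid using a simple D8 steepest-descent rule:

    import numpy as np

    def trace_drop(dem, row, col, max_steps=10000):
        """Follow the steepest descent (D8) from (row, col) until the drop pools."""
        path = [(row, col)]
        for _ in range(max_steps):
            r0, r1 = max(row - 1, 0), min(row + 2, dem.shape[0])
            c0, c1 = max(col - 1, 0), min(col + 2, dem.shape[1])
            window = dem[r0:r1, c0:c1]
            r_off, c_off = np.unravel_index(np.argmin(window), window.shape)
            nr, nc = r0 + r_off, c0 + c_off
            if dem[nr, nc] >= dem[row, col]:   # no lower neighbour: water pools here
                break
            row, col = nr, nc
            path.append((row, col))
        return path

    # Example on a small synthetic bowl: the drop ends near the centre.
    yy, xx = np.mgrid[0:50, 0:50]
    dem = (xx - 25) ** 2 + (yy - 25) ** 2
    print(trace_drop(dem, 3, 4)[-1])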

In addition, the session will discuss possibilities for integrating interactive environments like the “Digital Sandbox” into participatory processes.

Figure: The Rhino Sandbox. © IGN

Figure: A sandbox and students. © IGN

Augmented reality (AR)

Augmented Reality (AR), or mixed reality, comprises interactive technologies for superimposing digital objects on the immediate, visible environment, often captured by mobile, camera-based sensors – typically mobile phones, tablets, or glasses/goggles.

AR brings features of the digital world into an individual person's perception of the real world: not simply by displaying data, but by integrating what is sensed and experienced immediately in the landscape with imagined and proposed scenarios and designs.

In this session, students will learn the workflow of transforming digital models into AR and of evaluating and communicating proposed scenarios directly in the case areas.
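One common step in such a workflow is converting the design model into a format mobile AR viewers accept. The sketch below does this with the trimesh package and glTF/GLB output; the package choice and file names are assumptions, not the specific tool chain used in the course.

    import trimesh

    # Load the proposal as a single mesh (e.g. an OBJ exported from Rhino).
    mesh = trimesh.load("proposed_pavilion.obj", force="mesh")   # hypothetical file

    # Keep coordinates in metres so the object appears at real-world scale on site,
    # then write GLB, which most mobile AR viewers can place in the camera view.
    mesh.export("proposed_pavilion.glb")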

Figure: Augmented reality. Wind turbines in the Swiss Alps. © Echtzeit, 2019

Credits and criteria

Credits: 5 ECTS.
To pass the course, students must read the obligatory course literature, attend the course on site, provide a short verbal reflection on their own PhD project, give a presentation of the design developed during the course (including technological reflections), and finally produce a 4-5 page manuscript covering their reflections on their PhD project in relation to the topics of the course.

Tutors/lecturers

  1. Dylan Cawthorne, Associate Professor, SDU UAS Centre (Unmanned Aerial Systems), The Maersk Mc-Kinney Moller Institute
  2. Patrick Moechel, Echtzeit GmbH
  3. Thomas Ott, Echtzeit GmbH
  4. Kane Borg, PhD candidate, Aalto University
  5. Rikke Munck Petersen, Associate Professor, IGN, UNICPH
  6. Anne Wagner, Assistant Professor, IGN, UNICPH
  7. Lene Fischer, Associate Professor, IGN, UNICPH
  8. NN, GeoTeam (Trimble distributor)
  9. Hans Skov-Petersen, Senior Researcher, IGN, UNICPH
  10. Pia Fricker, Professor of Practice for Computational Methodologies in Landscape Architecture and Urbanism, Aalto University, Finland
  11. Mariusz Hermansdorfer, Industrial PhD fellow, IGN, UNICPH and Rambøll

Course organizers

  1. Hans Skov-Petersen, Senior Researcher, IGN, UNICPH. Course responsible and corresponding organizer.
    E-mail: hsp@ign.ku.dk
    Mobile: +45 23 82 80 45.
  2. Pia Fricker, Professor of Practice for Computational Methodologies in Landscape Architecture and Urbanism, Aalto University, Finland
  3. Mariusz Hermansdorfer, Industrial PhD fellow, IGN, UNICPH and Rambøll