::scAssemblage

Part of the Uterii research project, ::scAssemblage is the first custom-made visualization apparatus created to support Uterii's experimental moments.

by: J.P.Gaste

Abstract

::scAssemblage is the first of a series of custom-made, open-access machine concepts that look into machine vision and image-processing algorithms used in gyneco-technologies. As accountability often lies in overlooked protocols hidden beneath technological black boxes, it was essential for me to examine those processes and reflect upon them through custom-made visualization tools that could enable new stories to be told. Following the ideas of human-machine reconfiguration introduced by Suchman, Uterii looks at machines as potential collaborators in the reconfiguration of “human-ness” and, subsequently, “uterus-ness.” In line with Suchman's understanding of human-ness, I seek to distinguish and reveal how uteri are seen through machines as a reflection of the human conception of the uterus.

Concept

scAssemblage looks into the potential of Kinect-based 3D scanning (a field of machine vision) to turn single scans into unpredictable point-cloud “assemblages,” later usable as “uncanny” CGI models. Through a transparent interface (e.g., the grayscaling process is visible and controllable), the machine displays some of the steps involved in the making of those 3D assemblages. Further, I implemented TouchOSC controllers to make the scanning process easier to operate remotely. The transparency and simplicity of the scanning process are here to inform the user of some of the relatively simple algorithmic decisions undertaken to create 3D renderings from physical objects. In simplifying the acquisition steps, the machine allows us to see how engineers have engineered the ways our world is made viewable. Further, by including a random function in the code, I tried to make the machine's participation more explicit. Indeed, this random function merges all the 3D meshes of the uterus into one accumulated 3D mesh: the randomization assigns each mesh a random position along the x, y, and z axes, creating unique assemblages facilitated by the machine's decisions and the user's co-dependency upon their companion. Although less informative than the upcoming Uterii machines, which will be developed around more specific procedures, scAssemblage was created as a form of poetic acknowledgement of Uterii's first (human and machine) participation and its promising visual manifestation.
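To make that random function concrete, here is a minimal sketch in openFrameworks-style C++; the function name mergeIntoAssemblage and the spread parameter are illustrative assumptions, not the project's actual code. Each stored mesh is displaced by its own random x, y, z offset before being accumulated, so every run yields a different assemblage.

    #include "ofMain.h"

    // Hypothetical sketch of the randomizing merge: every scanned mesh gets its
    // own random offset, so each scan moves as a single unit into the assemblage.
    ofMesh mergeIntoAssemblage(const std::vector<ofMesh>& scans, float spread) {
        ofMesh assemblage;
        assemblage.setMode(OF_PRIMITIVE_POINTS);
        for (const ofMesh& scan : scans) {
            glm::vec3 offset(ofRandom(-spread, spread),
                             ofRandom(-spread, spread),
                             ofRandom(-spread, spread));
            for (const glm::vec3& v : scan.getVertices()) {
                assemblage.addVertex(v + offset);  // displace, then accumulate
            }
        }
        return assemblage;
    }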

Machine technical description

The scAssemblage 3D scanning apparatus consists of three distinct steps: image acquisition, image processing (from point cloud to mesh), and randomized visual restitution. A minimal code sketch of the whole pipeline follows the list.
_ Image acquisition: As described in fig. 1, scAssemblage uses a Kinect to capture a depth image and proceeds to object tracking by thresholding the grayscaled depth information.
_ Image processing: Once thresholded and “tracked,” the object on display is scanned and turned into a 3D point cloud (a cloud of three-dimensional pixels, or voxels, located by their x, y, z coordinates). This point cloud is stored as a mesh, itself stored in a vector. The function is triggered through a TouchOSC controller, allowing people to scan and store the desired tracked object into the vector of meshes.
_ Randomized restitution: As distinct meshes from the point clouds are added to the vector of meshes, each mesh is randomly assigned an x, y, z position and displayed on screen, revealing what I refer to as “an assemblage.” The “save” function enables the user to export her “assemblage” as a “.ply” model, readable by software like MeshLab, to be later turned into a CGI model.
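As a hedged illustration of these three steps, here is a minimal openFrameworks C++ sketch assuming the ofxKinect addon. The names (nearClip, farClip, captureScan, the trigger handlers) are illustrative, the millimetre threshold band stands in for the grayscale thresholding described above, and mergeIntoAssemblage refers to the sketch in the Concept section.

    #include "ofMain.h"
    #include "ofxKinect.h"

    ofxKinect kinect;                     // depth camera (image acquisition)
    std::vector<ofMesh> scans;            // the "vector of meshes" described above
    float nearClip = 500, farClip = 900;  // illustrative threshold band, in mm

    // Image acquisition + processing: keep only the depth pixels inside the
    // threshold band (the "tracked" object) and turn them into a point cloud.
    ofMesh captureScan() {
        ofMesh scan;
        scan.setMode(OF_PRIMITIVE_POINTS);
        for (int y = 0; y < kinect.getHeight(); y += 2) {   // step 2 px to thin the cloud
            for (int x = 0; x < kinect.getWidth(); x += 2) {
                float d = kinect.getDistanceAt(x, y);       // depth in millimetres
                if (d > nearClip && d < farClip) {
                    scan.addVertex(kinect.getWorldCoordinateAt(x, y));
                }
            }
        }
        return scan;
    }

    // Wired to a TouchOSC trigger: store the current scan in the vector of meshes.
    void onScanPressed() { scans.push_back(captureScan()); }

    // Wired to the "save" trigger: merge the scans at random offsets and export
    // a ".ply" file readable by MeshLab (ofMesh::save writes PLY).
    void onSavePressed() { mergeIntoAssemblage(scans, 200).save("assemblage.ply"); }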

Further development

_ It is important for the next machines to be more connected to a specific gyneco-technology.
_ scAssemblage is currently available on GitHub (https://github.com/jppg92/scAssemblage) for collective improvement and suggestions.
_ I am working on its Thingiverse file to turn its structure into a laser-cuttable “.ai” file.
_ Using a Kinect is practical for large-scale scanning of spaces, people, or objects larger than 20 x 20 cm. I will look into custom-built equivalents (camera light sensors, or perhaps laser scanning) for smaller subjects, as Uterii's objects are of rather small scale (like tiny organs).
_ The interface is “transparent” for now, but to achieve its informational aim and its function as a “transmitter” of knowledge, I will continue working on it to make it more informative. I may need to use Processing here, as openFrameworks seems relatively limited in terms of UI.

Bibliography

a. Theory
Jones, Meredith. “Expressive Surfaces: The Case of the Designer Vagina.” Theory, Culture & Society, vol. 34, no. 7–8, 2017, pp. 29–50.
MedeaTV. Lucy Suchman: Restoring Information’s Body - Remediations at the Human-Machine Interface. YouTube, https://www.youtube.com/watch?v=Z3I-ndAXYWg&t=1567s. Accessed 12 May 2019.
Plant, Sadie. “On the Matrix: Cyberfeminist Simulations.” The Cybercultures Reader, edited by David Bell and Barbara M. Kennedy, Routledge, 2000, pp. 325–326.
Prentice, Rachel. “The Anatomy of a Surgical Simulation: The Mutual Articulation of Bodies in and through the Machine.” Social Studies of Science, vol. 35, no. 6, 2005, pp. 837–866.
Suchman, Lucille Alice. Human-Machine Reconfigurations: Plans and Situated Actions. 2nd ed., Cambridge University Press, 2007.
b. Technique
Haggren, Henrik. Mapvision - The Photogrammetric Machine Vision System For Engineering Applications. Edited by Michael J. W. Chen and Robert H. Thibadeau, 1987, p. 210. DOI.org (Crossref), doi:10.1117/12.937877.
Hernández-López, José-Juan, et al. “Detecting Objects Using Color and Depth Segmentation with Kinect Sensor.” Procedia Technology, vol. 3, 2012, pp. 196–204. DOI.org (Crossref), doi:10.1016/j.protcy.2012.03.021.
Perevalov, Denis, and Igor Tatarnikov. Mastering OpenFrameworks: Creative Coding Demystified: A Practical Guide to Creating Audiovisual Interactive Projects with Low-Level Data Processing Using OpenFrameworks. Packt Publishing, 2013.
---. openFrameworks Essentials. Packt Publishing, 2015.
Szeliski, Richard. Computer Vision: Algorithms and Applications. Springer London, 2011. DOI.org (Crossref), doi:10.1007/978-1-84882-935-0.