Commit 3b6d84b7 authored by Rita-Josy Haddoub

Update README.md

parent 5ea48af0
@@ -19,10 +19,15 @@ Project Adjustments to Covid-19 lockdown, and Lebanese Economic/Political/Social
Initially, I wanted to build my own network model that would compute the whole process from data input to latent sculpture. The output of the model would have gone a step further than a reconstructed visualization as a .JPG, producing an .STL file instead. The .STL file is a 3D model format recognized by 3D printers. I was beginning to look into point clouds to automate this step through my model, and finally the .STL would have been rendered in concrete through a 3D printer.
I did not carry on to explore TensorFlow and configure my own GAN; instead, using my knowledge of Python from previous Machine Learning modules, I trained my own model through pix2pix, which has its pre-set properties. With this adjustment, I kept my data-input-to-physical-output idea. Instead of 3D printing a GAN into concrete, I went to a brick factory and molded a Beton myself, following the meticulous designs generated by my model. The concept of rendering a neural network to concrete remains as intended; however, sculpting it myself involves more collaboration with the network, rather than giving the project full automation.
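For reference, pix2pix-tensorflow's pre-set pipeline trains on paired images: each training example is a single picture with the input on one half and the target on the other, each half 256x256. Below is a minimal sketch of assembling such a pair in Python; the file names and the masked/original pairing are illustrative assumptions, not the exact script used for this project.

```python
from PIL import Image

# pix2pix-tensorflow expects each training example as one combined image:
# the input (A) and the target (B) pasted side by side, 256 pixels each.
def combine_pair(a_path, b_path, out_path, size=256):
    a = Image.open(a_path).convert("RGB").resize((size, size))
    b = Image.open(b_path).convert("RGB").resize((size, size))
    pair = Image.new("RGB", (size * 2, size))
    pair.paste(a, (0, 0))      # left half: network input
    pair.paste(b, (size, 0))   # right half: expected output
    pair.save(out_path)

# Hypothetical example: a masked photo as input, the full photo as target.
combine_pair("beton_masked.jpg", "beton_original.jpg", "pair_0001.png")
```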
# Data
![IMG_0791__1_](/uploads/b7d5c7985ee2c6fde213e1e856093536/IMG_0791__1_.jpg)
![IMG_5812_2](/uploads/d2d0b813cdf2e8bfa34104695f047bf4/IMG_5812_2.jpg)
![PHOTO-2020-02-21-00-19-17](/uploads/5005ed19284fb8e6d29735fba11845fe/PHOTO-2020-02-21-00-19-17.jpg)
![IMG_1005](/uploads/d3100d035335d0800fe01aaac89741b8/IMG_1005.jpg)
![Screen_Shot_2020-06-20_at_10.33.26_PM](/uploads/0fb2606a2fe26eb1e8ebe275e8e33cd2/Screen_Shot_2020-06-20_at_10.33.26_PM.png)
The Beton Dataset is a collection of photos I took myself of beton found spontaneously throughout the city of Beirut and its outskirts.
Scraping images from large online databases is the conventional way to collect images for working with GANs. Deep learning goes hand in hand with big data, since it deals with extracting meaningful information from vast amounts of it. Memo Akten's Deep Meditations illustrates GANs as a powerful medium for creative expression: his image dataset consists of 'everything', all scraped from Flickr with tags such as 'life, love, faith, everything, etc.'.
The power of GANs to learn the distribution of all of their input data, and to find forms, compositions, and patterns within an endless network of possible trajectories, makes them a fascinating tool for understanding big data better and distinguishes them from other algorithms.
My raw image dataset currently consists of only about 40 original images. The model, however, has been trained on a total of 274 images, consisting of re-sized, cropped, rotated, and otherwise transformed versions of the original dataset. I then ran a batch process to rescale the images to 256x256. The method used for training is 'inpainting' from pix2pix-tensorflow.
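The batch rescaling and simple augmentations described above could look something like the following sketch; the folder names, the exact set of transforms, and Pillow as the imaging library are assumptions, not the project's actual batch process.

```python
import os
from PIL import Image

SRC, DST = "beton_raw", "beton_256"   # hypothetical folder names
os.makedirs(DST, exist_ok=True)

for name in sorted(os.listdir(SRC)):
    if not name.lower().endswith((".jpg", ".jpeg", ".png")):
        continue
    img = Image.open(os.path.join(SRC, name)).convert("RGB")
    base = os.path.splitext(name)[0]
    # A few simple variants per original: rotations and a mirror image,
    # each rescaled to the 256x256 size that pix2pix-tensorflow expects.
    variants = {
        "orig": img,
        "rot90": img.rotate(90, expand=True),
        "rot180": img.rotate(180, expand=True),
        "flip": img.transpose(Image.FLIP_LEFT_RIGHT),
    }
    for tag, variant in variants.items():
        variant.resize((256, 256)).save(os.path.join(DST, f"{base}_{tag}.jpg"))
```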
Creating the Beton dataset manually was a deliberate choice: firstly because data mining is restricted to digitized and transparent information, and the definition I see Beton in falls outside its categorization within construction sites. The dataset grew as I came across different uses of Beton; it was never my intention to search for them. I also received images by text from others who would come across one. Getting the data was a very material and lived-in process.
# Rendering a Neural Network to Concrete