diff --git a/piano/README.md b/piano/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..07a90a14b5c5ea10bdf1bc28de3b2e20650e3716
--- /dev/null
+++ b/piano/README.md
@@ -0,0 +1,70 @@
+# Piano gestures dataset
+
+This dataset contains multimodal recordings of two professional pianists performing an excerpt from Robert Schumann's _Träumerei_ (Kinderszenen Op.15 No.7) with different variations in speed execution and expressive intention.
+
+This dataset is reported in the following paper:
+Álvaro Sarasúa, Baptiste Caramiaux, Atau Tanaka, and Miguel Ortiz (2017). __Datasets for the Analysis of Expressive Musical Gestures__. In _Proceedings of the 4th International Conference on Movement Computing, London, UK, June 2017 (MOCO'17)_.
+
+<!-- MarkdownTOC autolink="true" autoanchor="true" bracket="round"-->
+
+- [Description](#description)
+    - [Summary of conditions](#summary-of-conditions)
+- [Dataset structure](#dataset-structure)
+    - [Take data](#take-data)
+- [License](#license)
+- [Contact](#contact)
+
+<!-- /MarkdownTOC -->
+
+<a name="description"></a>
+## Description
+Each pianist played the excerpt at 3 different tempi, with and without a metronome: normal (70 beats per minute), slow (40 bpm) and fast (120 bpm). In the no-metronome condition, they also played with _rubato_ (continuous expressive tempo alteration).
+In each metronome and tempo condition, they played with 5 expressive intentions: normal, still (trying to move as little as possible), exaggerated, finger _legato_ (consecutive melodic notes smoothly connected) and _staccato_ (consecutive notes detached). __3 takes__ were recorded for each combination of conditions, giving 7 metronome/tempo combinations × 5 expressive intentions × 3 takes = 105 takes per pianist.
+
+<a name="summary-of-conditions"></a>
+### Summary of conditions
+
+- __Metronome__
+    + With metronome
+    + Without metronome
+- __Tempo__
+    + Normal
+    + Fast
+    + Slow
+    + Rubato (only without metronome)
+- __Expressive intention__
+    + Normal
+    + Still
+    + Exaggerated
+    + Finger _legato_
+    + _Staccato_
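+As a sanity check on the counts above, the combinations can be enumerated (a minimal sketch; the condition labels below are illustrative, not the actual folder-name tokens):
+
+```python
+from itertools import product
+
+# Illustrative labels only; see "Dataset structure" for the folder naming scheme.
+tempi_with_metronome = ["normal", "fast", "slow"]
+tempi_without_metronome = ["normal", "fast", "slow", "rubato"]
+intentions = ["normal", "still", "exaggerated", "legato", "staccato"]
+takes_per_condition = 3
+
+# Every (tempo, intention) pair, with and without metronome.
+conditions = list(product(tempi_with_metronome, intentions)) \
+    + list(product(tempi_without_metronome, intentions))
+
+total_takes = len(conditions) * takes_per_condition
+print(total_takes)  # 7 metronome/tempo combinations x 5 intentions x 3 takes = 105
+```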
+
+<a name="dataset-structure"></a>
+## Dataset structure
+The dataset is organized in two separate folders for each of the pianists ("Pianist_01" and "Pianist_02"). Inside each pianist folder, there is one folder for each take, named as:
+
+`<METRONOME>_<TEMPO>_<EXPRESSIVE_INTENTION>_<TAKE_NR>`
+
+For example, the data for the 2nd take played with metronome, at normal tempo and with exaggerated expression, is located in `METRONOME_TEMPO_EXAG_2`.
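+A hypothetical helper (not part of the dataset) for splitting a take folder name back into its four fields, assuming the last three fields never contain underscores:
+
+```python
+def parse_take_folder(name: str) -> dict:
+    """Split <METRONOME>_<TEMPO>_<EXPRESSIVE_INTENTION>_<TAKE_NR> into fields.
+
+    Splits from the right, so only the metronome field may contain underscores.
+    """
+    metronome, tempo, intention, take_nr = name.rsplit("_", 3)
+    return {
+        "metronome": metronome,
+        "tempo": tempo,
+        "intention": intention,
+        "take": int(take_nr),
+    }
+
+print(parse_take_folder("METRONOME_TEMPO_EXAG_2"))
+# {'metronome': 'METRONOME', 'tempo': 'TEMPO', 'intention': 'EXAG', 'take': 2}
+```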
+
+<a name="take-data"></a>
+### Take data
+The following data is available for every take:
+
+- __Audio__: `audio.wav`
+- __Video__: `video.mov`
+- __MIDI__: `score.mid`
+- __Motion Capture__: position and orientation of 22 body limbs. Due to occlusions, the position of body limbs below the abdomen is unstable.
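+A small sketch for checking that a take folder contains the per-take files listed above (`missing_files` is a hypothetical helper, not part of the dataset; the motion-capture file name is not specified here, so it is left out):
+
+```python
+from pathlib import Path
+
+# File names as listed above.
+EXPECTED_FILES = ("audio.wav", "video.mov", "score.mid")
+
+def missing_files(take_dir):
+    """Return the expected per-take files absent from take_dir."""
+    take_dir = Path(take_dir)
+    return [f for f in EXPECTED_FILES if not (take_dir / f).is_file()]
+```
+
+For example, `missing_files(Path("Pianist_01") / "METRONOME_TEMPO_EXAG_2")` returns an empty list when the take folder is complete.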
+
+
+<a name="license"></a>
+## License
+--License--
+
+<a name="contact"></a>
+## Contact
+--Contact info--
\ No newline at end of file