Parallel computing group news

Vacancy: Postdoc/researcher position in parallel computing for machine learning

The University of Turin and the Parallel Programming research group

The University of Turin (UNITO) is one of the oldest, largest, and most prestigious Italian universities. UNITO has about 70,000 students, 4,000 academic and technical staff, and 1,800 post-graduate and post-doctoral students, and is organised into 27 departments. It is deeply involved in scientific research and manages roughly 500 projects per year, at both the national and international level. UNITO's long record of participation in the EU strategic research agenda includes 115 FP7-funded research projects, among which 33 UNITO-coordinated projects and 7 ERC grants (5 with UNITO as host institution), along with 68 H2020-funded projects. The Computer Science Department was founded in 1971 and nowadays has over 75 tenured academics and researchers and over 50 postdoctoral research associates and PhD candidates, whose scientific activities cover all the main research areas in Computer Science.

Vacancy

The appointment will be in the “Parallel Programming” research group (http://alpha.di.unito.it) of the Computer Science Department. The group has research interests in parallel programming models and run-time systems. The key staff of the research group has been involved in several EU projects, including the EU FP7-STREP projects ParaPhrase (2011-2014) and REPARA (2013-2016), the EU H2020-RIA projects RePhrase (2015-2018) and Toreador (2016-2019), the OptiBike Fortissimo2 EU I4MS experiment (2017-2018), the beHAPI EU RISE (2017-2021), as well as the EU network of excellence HiPEAC (High Performance Embedded Architecture and Compilation), EU COST Action IC1305 Network for Sustainable Ultrascale Computing (NESUS, 2014-2018), and EU COST Action IC1406 High-Performance Modelling and Simulation for Big Data Applications (cHiPSet, 2015-2019).

The successful applicant will work on two newly funded projects:

  • HPC4AI: High-Performance Computing for Artificial Intelligence (2018-2020, EU FESR 2014-20, 4.5M€, 2 partners)
  • DeepHealth: Deep-Learning and HPC to Boost Biomedical Applications for Health (2019-2022, EU 2018-ICT-11, 12.8M€, 21 partners)

Topics:

  • Programming languages and frameworks for the distributed training of neural networks (a minimal illustration of the underlying data-parallel pattern appears after this list)
  • Development of run-time systems for parallel computing (shared-memory, message-passing, SIMT, etc.)
  • Cloud engineering, Deployment-as-a-Service for distributed training and federated training
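
To give a flavour of the first two topics, here is a minimal, self-contained sketch (illustrative only, not project code) of the synchronous data-parallel pattern at the heart of distributed training: each worker computes a gradient on its own data shard, the per-worker gradients are averaged, and the shared model is updated. In this sketch the workers are threads on shared memory; across nodes the same averaging step would become a message-passing allreduce. All numbers and names are invented for illustration.

    #include <iostream>
    #include <numeric>
    #include <thread>
    #include <utility>
    #include <vector>

    int main() {
        const int num_workers = 4;
        const double learning_rate = 0.1;
        double weight = 0.0;  // a one-parameter "model" for y = 2*x

        // Toy dataset of (x, y) pairs, split into one shard per worker.
        std::vector<std::vector<std::pair<double, double>>> shards = {
            {{1, 2}}, {{2, 4}}, {{3, 6}}, {{4, 8}}};

        for (int step = 0; step < 100; ++step) {
            std::vector<double> grads(num_workers, 0.0);
            std::vector<std::thread> workers;
            for (int w = 0; w < num_workers; ++w) {
                // Each worker computes the gradient of 0.5*(weight*x - y)^2
                // over its own shard only.
                workers.emplace_back([&, w] {
                    for (const auto& p : shards[w])
                        grads[w] += (weight * p.first - p.second) * p.first;
                });
            }
            for (auto& t : workers) t.join();

            // "Allreduce" step: average the per-worker gradients, then update.
            double avg = std::accumulate(grads.begin(), grads.end(), 0.0) / num_workers;
            weight -= learning_rate * avg;  // synchronous SGD step
        }
        std::cout << "learned weight ~ " << weight << "\n";  // approaches 2
    }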

Requirements:

Applicants should have a PhD-level qualification in parallel computing. Strong C/C++ coding skills and parallel programming experience are essential. Experience in machine learning and machine vision is welcome, even if the research profile is not centred on these areas. Applicants should be capable of working on their own initiative and of leading a small research team, so excellent communication and organisational skills are also required. Publishing scientific papers, participating in international project meetings, and presenting at conferences are an integral part of the work.

Conditions and salary:

We offer a full-time contract and a highly stimulating working environment with state-of-the-art infrastructure, flexible hours, and full support for conference participation. Residence in Piedmont is required (housing in Turin is generally not very expensive).

The salary is 30k€-40k€ per annum (roughly 1800€-2300€/month after taxes).

Application

Your application should include a letter of motivation (at most one page) covering your research interests and how they will support this project, a description of your PhD thesis (or relevant work experience), and evidence of independent, self-motivated research and software development. Please highlight experience in collaborative projects and especially in open-source software development. Please include your CV, degrees and grades, any other relevant documents, and the earliest possible starting date. Please also include the contacts of at least two referees.

The vacancy will remain open until a suitable candidate has been hired. Applications will be regularly reviewed and potential candidates will be contacted.

Enquiries about the vacancy:

Name: Prof. Marco Aldinucci

Email: marco.aldinucci@di.unito.it

The city of Turin

Turin (Italian: Torino) is an important business and cultural centre in northern Italy and the capital of the Piedmont region. It is located mainly on the left bank of the Po River and is surrounded by the western Alpine arch. The population of the urban area is estimated at 1.7 million inhabitants. The city has a rich culture and history and is known for its numerous art galleries, restaurants, churches, palaces, opera houses, piazzas, parks, gardens, theatres, libraries, museums and other venues. Turin is well known for its Baroque, Rococo, Neo-classical, and Art Nouveau architecture. Many of the city’s public squares, castles, gardens and elegant palazzi, such as Palazzo Madama, were built between the 16th and 18th centuries, after the capital of the Duchy of Savoy (later the Kingdom of Sardinia) was moved to Turin from Chambéry (now in France) as part of the urban expansion.

Turin is sometimes called the “cradle of Italian liberty” for having been the birthplace and home of notable politicians and people who contributed to the Risorgimento, such as Cavour, a leading figure in the movement toward Italian unification, and many of the protagonists of Italian political and social life in the 20th century, among others Antonio Gramsci, Piero Gobetti, and Palmiro Togliatti. The city used to be a major European political centre, being Italy’s first capital in 1861 and home to the House of Savoy, the Italian royal family. The city currently hosts some of the best Italian universities, colleges, academies, lycea and gymnasia, such as the six-century-old University of Turin and the Turin Polytechnic. Prestigious and important museums, such as the Egyptian Museum and the Mole Antonelliana, are also found in the city. Turin’s many monuments and sights make it one of the world’s top 250 tourist destinations. Turin is also well known as the home of the Shroud of Turin, of the football teams Juventus F.C. and Torino F.C., and of the headquarters of the automobile manufacturers FIAT, Lancia and Alfa Romeo, and as host of the 2006 Winter Olympics. Several International Space Station modules, such as Harmony, Tranquility, and Columbus, were also manufactured at the Thales Alenia Space factory in Turin.

Turin is the only Italian city listed in the New York Times “52 Places to Go in 2016” guide: http://www.nytimes.com/interactive/2016/01/07/travel/places-to-visit.html

Piedmont is ranked first in the Lonely Planet “Best in Travel 2019” guide: https://www.lonelyplanet.com/italy/liguria-piedmont-and-valle-daosta/piedmont

Maurizio’s Poster awarded at PNNL PoGo Symposium

On Aug. 21, Discovery Hall at PNNL hosted the 10th Annual Post Graduate Research Symposium, where post-graduates have the opportunity to present their work. Judges score both oral and poster presentations in each degree category (i.e., Post Bachelors, Post Masters and Post Doctorate).

Bo Peng earned the Post Doctorate Top Oral Presentation award for “Impact of Secondary Phase on SCR Catalyst Durability”. Maurizio Drocco claimed the Post Doctorate Top Poster Presentation award for his “Dynamic Allocation Schemas in Task-Based Runtimes”. Kuan-Ting Lin won in the Post Masters Posters category with “Renewable Polymer Precursor(s) from Acetaldehyde Condensation”, and Hope Lackey won in the Post Bachelors Posters category with “Elemental Separation of Fourteen Lanthanides via Isotachophoresis in a Microfluidic Device”.

Past and present Linus Pauling fellows presented their research at the 2018 Symposium.

13M€ EU H2020 DeepHealth project is heading to kick-off

DeepHealth:  Deep-Learning and HPC to Boost Biomedical Applications for Health

EU H2020 ICT-11-2018 Innovation Action – Total cost 12.8M€, 2019-2021 (36 months)

The Department of Computer Science of the University of Turin, together with the University's departments of neuroscience and medical sciences, the consortium of public hospitals of the city of Turin (Città della Salute e della Scienza) and 20 other European academic and industrial partners, is joining forces to push Artificial Intelligence methods, supported by High-Performance Computing, towards superhuman precision levels in support of the diagnosis, monitoring and treatment of diseases.

As the parallel computing research group, we will support DeepHealth with a novel, high-performance, easy-to-use and fully open platform for training deep networks on medical images. The platform will be made available to the whole national research community through the HPC cloud ecosystem at HPC4AI (the Turin competence centre for High-Performance Computing and Artificial Intelligence).

Excited.


DeepHealth:  Deep-Learning and HPC to Boost Biomedical Applications for Health

Health scientific discovery and innovation are expected to quickly move forward under the so-called “fourth paradigm of science”, which relies on unifying the traditionally separated and heterogeneous high-performance computing and big data analytics environments.
Under this paradigm, the DeepHealth project will put HPC computing power at the service of biomedical applications and apply Deep Learning (DL) techniques to large and complex biomedical datasets to support new and more efficient ways of diagnosing, monitoring and treating diseases.

DeepHealth will develop a flexible and scalable framework for the HPC + big data environment, based on two new libraries: the European Distributed Deep Learning Library (EDDLL) and the European Computer Vision Library (ECVL). The framework will be validated in 14 use cases, which will allow models to be trained and will provide training data from different medical areas (migraine, dementia, depression, etc.). The resulting trained models and the libraries will be integrated and validated in 7 existing biomedical software platforms, which include: a) commercial platforms (e.g. the PHILIPS Clinical Decision Support System or THALES PIAF); and b) research-oriented platforms (e.g. CEA’s ExpressIF™ or CRS4’s Digital Pathology). The impact is measured by tracking the time-to-model-in-production (TTMIP).

Through this approach, DeepHealth will also adapt HPC resources to the needs of DL applications and underpin the compatibility and uniformity of the set of tools used by medical staff and expert users. The final DeepHealth solution will be compatible with HPC infrastructures ranging from those in supercomputing centres to those in hospitals.
DeepHealth involves 21 partners from 9 European countries, bringing together a multidisciplinary group of research organisations (9), health organisations (4), large industrial partners (4) and SMEs (4), with a strong commitment towards innovation, exploitation and sustainability.

Maurizio got his PhD and is now a postdoctoral researcher at Pacific Northwest National Laboratory (PNNL)

Maurizio Drocco got his PhD in Computer Science at the University of Torino in October 2017, defending his thesis entitled “Parallel Programming with Global Asynchronous Memory: Models, C++ APIs and Implementations”. Tomorrow he will start a new adventure as a postdoctoral researcher at Pacific Northwest National Laboratory (PNNL), WA, USA.

Congratulations Maurizio. It has been a pleasure working with you for the past 4 years.

M. Drocco, “Parallel Programming with Global Asynchronous Memory: Models, C++ APIs and Implementations,” PhD Thesis, 2017. doi:10.5281/zenodo.1037585

Viva snapshot

Viva committee: Prof. Jose Daniel Garcia Sanchez (UC3M, Madrid, Spain), Prof. Marco Danelutto (Università di Pisa, Italy), Prof. Siegfried Benkner (University of Vienna, Austria).


Claudia got her PhD on data analytics and is now a scientist at IBM TJ Watson

Claudia Misale got her PhD in Computer Science at the University of Torino on May 11, 2017, defending her thesis entitled “PiCo: A Domain-Specific Language for Data Analytics Pipelines”.

In her thesis, Claudia reviews and analyses the state-of-the-art frameworks for data analytics, proposes a methodology to compare their expressiveness, and advocates the design of a novel C++ DSL for big data analytics, called PiCo (Pipeline Composition). Unlike Spark, Flink and similar frameworks, PiCo is fully polymorphic and exhibits a clear separation between data and transformations. Together with the careful C++/FastFlow implementation, this eases application development, since data scientists can experiment with different pipelines of transformations without any need to adapt the data type (or its memory layout). Types are inferred along the transformation pipeline in a “fluent” programming fashion. The clear separation between transformations, the data type and its layout in memory makes it possible to truly optimise data movement, memory usage and, eventually, performance: applications developed with PiCo exhibit a memory footprint up to 10x smaller than their Spark/Flink equivalents. The fully C++/FastFlow run-time support makes it possible to generate the network of run-time support processes directly from the data-processing pipeline, thus achieving the maximum scalability allowed by true data dependencies (well beyond the simple master-worker paradigm of Spark and Flink). Being written in C++11/14, PiCo is already open to hosting native GPU offloading, which paves the way for the convergence of analytics and machine learning. See more at DOI:10.5281/zenodo.579753
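
As an illustration of the programming style described above, the toy C++ sketch below mimics a fluent pipeline in which the element type is inferred along the chain of transformations while data and transformations remain separate. It is a deliberately simplified, hypothetical example (the Pipe class and its map/reduce methods are invented here) and does not reproduce the actual PiCo API or its FastFlow-based run-time.

    #include <functional>
    #include <iostream>
    #include <numeric>
    #include <utility>
    #include <vector>

    // Pipe is a hypothetical, in-memory stand-in for a pipeline stage's output.
    template <typename T>
    class Pipe {
    public:
        explicit Pipe(std::vector<T> data) : data_(std::move(data)) {}

        // map: apply a transformation element-wise; the element type of the next
        // stage is inferred from the callable's return type, so the pipeline
        // stays polymorphic in the data type.
        template <typename F>
        auto map(F f) const -> Pipe<decltype(f(std::declval<T>()))> {
            using U = decltype(f(std::declval<T>()));
            std::vector<U> out;
            out.reserve(data_.size());
            for (const auto& x : data_) out.push_back(f(x));
            return Pipe<U>(std::move(out));
        }

        // reduce: fold all elements with a binary operator and an identity value.
        template <typename F>
        T reduce(T init, F f) const {
            return std::accumulate(data_.begin(), data_.end(), init, f);
        }

    private:
        std::vector<T> data_;
    };

    int main() {
        Pipe<int> source({1, 2, 3, 4});
        // Transformations are composed fluently; no change of data layout is needed.
        double sum_of_squares =
            source.map([](int x) { return static_cast<double>(x) * x; })
                  .reduce(0.0, std::plus<double>());
        std::cout << sum_of_squares << "\n";  // prints 30
    }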

Claudia is flying today to New York to start her career as a scientist at IBM TJ Watson research center within the Data-Centric Systems Solutions.

Congratulations Claudia. It has been a pleasure working with you for the past 4 years.

Viva snapshots

Photos: Claudia’s thesis and viva talk.

Viva committee: Prof. Jose Daniel Garcia Sanchez (UC3M, Madrid, Spain), Prof. Ernesto Damiani (Khalifa University, Abu Dhabi, UAE), Prof. Domenico Talia (Università della Calabria, Italy).

OptiBike experiment funded under Fortissimo2 EU I4MS

Our project OptiBike (total cost 230k€) has been incorporated into the Fortissimo2 EU I4MS project (n. 680481). This is also the first EU-funded project at the HPC laboratory of ICxT@UNITO.

We are looking forward to the kick-off.

Maximum loads and principal directions in the Global Stiffness load case

OptiBike: Robust Lightweight Composite Bicycle design and optimization

In the current design process for composite materials, the effect of manufacturing uncertainty on the structural and dynamic performance of mechanical structures (aeronautic, automotive and others) cannot be accurately assessed due to the limitations of current computational resources, and is therefore compensated for by applying safety factors. This non-ideal situation usually leads to overdesigned structures that could achieve higher performance at the same safety levels.
The objective of this experiment is to establish a design workflow and service for composite material modelling, simulation and numerical optimization that uses uncertainty quantification and HPC to deliver high-performance and reliable composite material products. This design workflow can be applied to any composite material structure, from aeronautics to bicycles, and gives SMEs easy-to-use and hassle-free access to advanced material design methodologies and the related HPC infrastructures, allowing the application of reliability-based design optimization (RBDO). To demonstrate this design workflow, a full composite bicycle will be designed and optimized based on real manufacturing data provided by IDEC, a Spanish SME with unique design and manufacturing capabilities for composite materials.
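
To make the role of uncertainty quantification concrete, the following is a minimal Monte Carlo sketch, under invented assumptions (a toy stiffness surrogate and made-up distributions for ply thickness and fibre modulus), of how manufacturing scatter can be propagated to an estimated failure probability. In the actual workflow each sample evaluation is an expensive composite-material simulation, which is why HPC resources are needed.

    #include <iostream>
    #include <random>

    int main() {
        std::mt19937 rng(42);
        // Assumed manufacturing scatter: ply thickness (mm) and fibre modulus (GPa).
        std::normal_distribution<double> thickness(1.0, 0.05);
        std::normal_distribution<double> modulus(230.0, 10.0);

        const double required_stiffness = 200.0;  // hypothetical design target
        const int samples = 100000;
        int failures = 0;

        for (int i = 0; i < samples; ++i) {
            // Toy surrogate: stiffness proportional to thickness * modulus.
            // In the real workflow this would be an expensive composite simulation.
            double stiffness = thickness(rng) * modulus(rng);
            if (stiffness < required_stiffness) ++failures;
        }

        std::cout << "estimated failure probability: "
                  << static_cast<double>(failures) / samples << "\n";
    }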

The expected results of this experiment are:

  1. a design optimization approach for composite materials by means of uncertainty quantification techniques for reliability and robustness;
  2. a design optimization service that can be used by any user to perform reliability-based product optimization (this will be demonstrated for the IDEC bicycle case).

Partners

  • Noesis Solutions, Belgium (coordinator)
  • IDEC, Spain
  • University of Torino (Alpha, ICxT and C3S), Italy
  • Arctur, Slovenia

C3S@UNITO will use INDIGO software tools

Adopting INDIGO-Data Cloud: the Scientific Computing Competence Centre of the University of Torino will use INDIGO software tools

The INDIGO-DataCloud project is happy to announce that it has signed a Memorandum of Understanding (MoU) with the Scientific Computing Competence Centre of the University of Torino (C3S). C3S is a research centre that focuses on scientific computing technology and applications and manages OCCAM (Open Computing Cluster for Advanced data Manipulation), a multipurpose HPC cluster. Thanks to this agreement, a collaboration has been set up between C3S and the INDIGO-DataCloud project for the use and development of advanced tools for scientific computing, particularly for the management of heterogeneous use cases in an HPC infrastructure context.

C3S will have access to the software tools developed by INDIGO-DataCloud and will be able to integrate them into the management layer of the OCCAM supercomputer. The INDIGO teams will collaborate with C3S in adapting and porting the tools to the specific use cases, giving support on a best-effort basis and providing, whenever feasible, patches and customisations for their software products.

“INDIGO-DataCloud aims at providing services that can be deployed on different computing platforms and enable the interoperability of heterogeneous e-infrastructures. C3S is a very interesting opportunity to test such capabilities and prove how our tools can really make the difference, providing seamless access, elasticity and scalability for the exploitation of data and computational resources”, says Giacinto Donvito, the Technical Director of the INDIGO-DataCloud project.

“We have a very wide variety of use cases, from traditional HPC in computational chemistry, physics and astrophysics to data-intensive genomics and computational biology, all the way to the social sciences and even the humanities, so we will have to use the best tools to accommodate them all. We trust that many INDIGO products will help us to improve the performance and usability of our centre”, says Matteo Sereno, professor at the Department of Computer Science of the University of Torino and C3S Director.

http://cordis.europa.eu/news/rcn/138515_en.html

Technical information in:

M. Aldinucci, S. Bagnasco, S. Lusso, P. Pasteris, S. Vallero, and S. Rabellino, “The Open Computing Cluster for Advanced data Manipulation (OCCAM),” in The 22nd International Conference on Computing in High Energy and Nuclear Physics (CHEP), San Francisco, USA, 2016.

Research, innovation and education in the age of Big Data

The tumultuous development of digital technologies and the rise of the Internet as a virtual world ever more integrated into everyday life have exponentially increased the availability of information in all fields of human knowledge. In this context, data science emerges as a new, intrinsically interdisciplinary field of knowledge, capable of profoundly transforming the methods and impact of university research.

The University of Turin has quickly equipped itself for this epochal challenge and wants to publicly present its capacity for innovation in theory, applied research and education, ranging from algorithms to big data, from the Internet of Things to machine learning.

Thanks to the support of the Collegio Carlo Alberto and the Compagnia di San Paolo, it has been possible to organise a one-day seminar in which researchers and lecturers present some of the most advanced research and discuss the opportunities (and risks) that new approaches offer to science, culture, society and the economy.

PROGRAMME

9:30   Welcome addresses

Gianmaria Ajani – Rector of the Università degli Studi di Torino

Francesco Profumo – President of the Compagnia di San Paolo

10:00   Introduction. Data science and enabling platforms

Marco Guerzoni (Dip. di Economia e Statistica and Despina Big Data Lab)

Marco Aldinucci (Dip. di Informatica and C3S)

10:15   The big data revolution in the life sciences

Moderator: Paolo Provero (Dipartimento di Biotecnologie Molecolari e Scienze per la Salute)

–   The digital revolution and personalised medicine in oncology

Enzo Medico (Dip. di Oncologia)

–   Big data and plant genomics

Alberto Acquadro (Dipartimento di Scienze Agrarie, Forestali e Alimentari)

11:00   Complexity unveiled: big data for the economy and business

Moderator: Magda Fontana (Dip. di Economia e Statistica and Despina Big Data Lab)

–   The degree of truth of data in decision-making processes

Elvira Di Nardo (Dip. di Matematica)

–   Predictive analytics and consumer behaviour

Massimiliano Nuccio (Dip. di Economia e Statistica and Despina Big Data Lab)

–   The regulatory challenges of big data between legal standards and cross-border flows

Alberto Oddenino (Dip. di Giurisprudenza)

11:50   Data science: tools and models for networks and applications

Moderator: Filippo Barbera (Dip. di Culture, Politica e Società)

–   What are mathematical models for? The case of the study of networks

Laura Sacerdote (Dip. di Matematica)

–   Statistical models for networks: applications to neuroscience

Antonio Canale (Dip. di Scienze economico-sociali e matematico-statistiche)

–   Network science: from data to applications

Giancarlo Ruffo (Dip. di Informatica)

12:40   Big data teaching programmes at the University of Turin

Matteo Ruggiero (Dip. di Scienze economico-sociali e matematico-statistiche)

13:00-14:00   Lunch

14:00   Life and health

Moderator: Lorenzo Richiardi (Dip. di Scienze Mediche)

–   Analytical-clinical big data for highly targeted prevention and medical diagnostics

Marco Vincenti (Dip. di Chimica)

–   From data to models in veterinary epidemiology

Mario Giacobini (Dip. di Scienze Veterinarie)

14:45   Society, culture and smart cities

Moderator: Guido Boella (Dip. di Informatica)

–   Sensory maps and the new science of cities

Rossano Schifanella (Dip. di Informatica)

–   Social physics and big data at the service of citizens’ well-being

Marcello Bogetti (Labnet)

–   Smart big data for the humanities

Vincenzo Lombardo (Dip. di Informatica)

–   Equity and sustainability in energy consumption: an IoT approach

Vito Frontuto (Dip. di Economia e Statistica)

15:50   ROUND TABLE – Data-driven innovation: the big data ecosystem in Turin

Moderator: Aldo Geuna (Dip. di Economia e Statistica and Collegio Carlo Alberto)

–   Valerio Cencig (Chief Data Officer – Intesa Sanpaolo)

–   Stefano Gallo (ICT Director – Città della Salute e della Scienza)

–   Roberto Moriondo (Director General – Comune di Novara)

–   Daniela Paolotti (Research Leader – Fondazione ISI)

–   Emilio Paolucci (BigData@Polito – Politecnico di Torino)

–   Gian Paolo Zanetta (Città della Salute e della Scienza)

16:45   ROUND TABLE – Institutions and big data

Moderator: Pietro Terna (President – Collegio Carlo Alberto)

–   Giuseppina De Santis (Councillor for Productive Activities and Innovation – Regione Piemonte)

–   Francesca Leon (Councillor for Culture – Comune di Torino)

–   Paola Pisano (Councillor for Innovation – Comune di Torino)

A poster session at the entrance will present some of the research completed or in progress at the various departments of the University of Turin.

http://www.unitonews.it/index.php/en/event_detail/luniversita-di-torino-verso-il-futuro-ricerca-innovazione-e-formazione-al-tempo-dei-big-data

EU I4MS Digital Manufacturing HUB – 1st Workshop, Torino, Italy

1st Regional DIMA-HUB Workshop
November 3rd, 2016
Centro Congressi Unione Industriale di Torino

Don’t miss the HPC session (moderated by Marco Aldinucci) and the CPS session (moderated by Enrico Bini).

Register here


DIMA-HUB is a feasibility study positioned within the context of the mentoring and sponsorship programme of the I4MS initiative (Phase 2), aiming to extend the geographical coverage of the I4MS ecosystem by establishing a Regional Digital Manufacturing Innovation (RDMI) hub in the Piedmont Region, Italy. The mission of the hub is to foster and accelerate technology advances while supporting start-ups and SMEs in their digital transformation path by connecting them with firms and competence centres within and outside the Piedmont Region.

Themes

  • Advanced laser-based applications (including additive manufacturing)
  • Internet of Things (IoT)
  • Robotics, Cyber Physical Systems (CPS)
  • High Performance Computing (HPC)

The consortium consists of five members: three specialise in research and development on digital manufacturing, and the remaining two are regional innovation clusters:

  1. Politecnico di Torino
  2. Università di Torino
  3. MESAP
  4. Torino Wireless (TOWL)
  5. Istituto Superiore Mario Boella (ISMB)

More information is available on the DIMA-HUB home page.