Weight Exchange in Distributed Learning


Date

2016

Authors

Dorner, Julian
Favrichon, Samuel
Öğrenci, Arif Selçuk

Publisher

IEEE

Open Access Color

Green Open Access

Yes

Publicly Funded

No
Impulse

Average

Influence

Average

Popularity

Average

Abstract

Neural networks may allow different organisations to extract knowledge from the data they collect about a similar problem domain. Moreover, learning algorithms usually benefit from having more training instances, but the parties owning the data are not always willing to share it. We propose a way to implement distributed learning that improves the performance of neural networks without sharing the actual data among the organisations. This paper deals with alternative methods of determining the weight exchange mechanism among nodes. The key idea is to run the epochs of learning separately at each node, then select the best weight set among the different neural networks and publish it to every node. The results show that an increase in performance can be achieved with simple weight exchange methods.
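The exchange mechanism the abstract describes, train locally, pick the best weight set, publish it to all nodes, can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes (hypothetically) simple linear models trained by gradient descent and a shared validation set for scoring, whereas the paper uses neural networks; the selection-and-publish loop is the part being demonstrated.

```python
import numpy as np

# Hypothetical setup: three organisations ("nodes"), each with private data
# drawn from the same underlying problem. Only weights are ever exchanged.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0, 0.5])

def make_data(n):
    X = rng.normal(size=(n, 3))
    y = X @ true_w + rng.normal(scale=0.1, size=n)
    return X, y

node_data = [make_data(50) for _ in range(3)]
X_val, y_val = make_data(100)  # assumed shared validation set for selection

def mse(w, X, y):
    return float(np.mean((X @ w - y) ** 2))

def local_epochs(w, X, y, epochs=20, lr=0.05):
    # Each node runs its epochs of learning independently on its own data.
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w

weights = [np.zeros(3) for _ in node_data]
for _ in range(5):
    # 1) independent local training at each node
    weights = [local_epochs(w, X, y) for w, (X, y) in zip(weights, node_data)]
    # 2) select the best weight set by validation loss
    best = min(weights, key=lambda w: mse(w, X_val, y_val))
    # 3) publish the winning weights to every node
    weights = [best.copy() for _ in weights]

print("validation MSE:", mse(weights[0], X_val, y_val))
```

After each round every node restarts from the best weight set found so far, so no raw training instance ever leaves its owning node; only the weight vectors move.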

Keywords

N/A

Fields of Science

0202 electrical engineering, electronic engineering, information engineering, 02 engineering and technology

OpenCitations Citation Count
N/A

Source

2016 International Joint Conference on Neural Networks (IJCNN)

Start Page

3081

End Page

3084

PlumX Metrics
Citations

Scopus : 1

Captures

Mendeley Readers : 4

SCOPUS™ Citations

1

checked on Feb 11, 2026

Page Views

4

checked on Feb 11, 2026

Downloads

86

checked on Feb 11, 2026

OpenAlex FWCI
0.0

Sustainable Development Goals

SDG data is not available