Authors: Yiğit, Gülsüm; Amasyalı, Mehmet Fatih
Title: Assessing the impact of minor modifications on the interior structure of GRU: GRU1 and GRU2
Type: Article
Date of Issue: 2022
Date Available: 2023-10-19
ISSN: 1532-0626 (print); 1532-0634 (online)
Volume: 34, Issue: 20
DOI: https://doi.org/10.1002/cpe.6775
Handle: https://hdl.handle.net/20.500.12469/5598
WOS ID: WOS:000729630100001
Scopus ID: 2-s2.0-85121038213
Journal quartile indicators: N/A; Q2
Language: English
Access: closed access (info:eu-repo/semantics/closedAccess)
Keywords: curriculum learning; gated recurrent units; recurrent neural networks; Seq2seq; short-term dependency

Abstract: In this study, two GRU variants, named GRU1 and GRU2, are proposed by making simple changes to the internal structure of the standard GRU, one of the popular RNN variants. Comparative experiments are conducted on four problems: language modeling, question answering, the addition task, and sentiment analysis. Moreover, on the addition task, curriculum learning and anti-curriculum learning strategies, which extend the training data with examples ordered from easy to hard or from hard to easy, are comparatively evaluated. The GRU1 and GRU2 variants outperformed the standard GRU, and the curriculum learning approach, in which the training data is expanded from easy to difficult, improved performance considerably.
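A minimal sketch of the curriculum learning setup described in the abstract, applied to the addition task. This is not the authors' code: the difficulty measure (operand digit count), the generator functions, and the schedule are all assumptions made for illustration; the abstract only states that the training data is extended with examples from easy to hard (curriculum) or hard to easy (anti-curriculum).

```python
# Hypothetical illustration of (anti-)curriculum ordering for the addition
# task. Difficulty is assumed to be the number of digits in the operands;
# the pool is extended level by level, easy-to-hard or hard-to-easy.

import random

def make_addition_example(num_digits, rng):
    """Generate one addition example with operands of num_digits digits."""
    lo, hi = 10 ** (num_digits - 1), 10 ** num_digits - 1
    a, b = rng.randint(lo, hi), rng.randint(lo, hi)
    return f"{a}+{b}", str(a + b)

def curriculum_pool(max_digits, per_level, rng, anti=False):
    """Build a training pool ordered easy-to-hard (hard-to-easy if anti)."""
    levels = range(1, max_digits + 1)
    if anti:
        levels = reversed(list(levels))
    pool = []
    for d in levels:
        pool.extend(make_addition_example(d, rng) for _ in range(per_level))
    return pool

rng = random.Random(0)
for question, answer in curriculum_pool(max_digits=3, per_level=2, rng=rng):
    print(question, "=", answer)
```

With `anti=True` the same pool is built in reverse, giving the hard-to-easy (anti-curriculum) schedule that the paper evaluates against the easy-to-hard one.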