CRAN Package Check Results for Package spinner

Last updated on 2025-04-28 00:50:37 CEST.

Flavor                              Version  Tinstall  Tcheck   Ttotal   Status  Flags
r-devel-linux-x86_64-debian-clang   1.1.0    15.02     1101.05  1116.07  OK
r-devel-linux-x86_64-debian-gcc     1.1.0    10.04     1153.35  1163.39  OK
r-devel-linux-x86_64-fedora-clang   1.1.0                       1168.92  OK
r-devel-linux-x86_64-fedora-gcc     1.1.0                       1285.45  OK
r-devel-windows-x86_64              1.1.0    16.00     379.00   395.00   OK
r-patched-linux-x86_64              1.1.0    13.94     1185.09  1199.03  OK
r-release-linux-x86_64              1.1.0    12.80     1084.27  1097.07  OK
r-release-macos-arm64               1.1.0                       280.00   OK
r-release-macos-x86_64              1.1.0                       70.00    OK
r-release-windows-x86_64            1.1.0    18.00     248.00   266.00   ERROR
r-oldrel-macos-arm64                1.1.0                       44.00    OK
r-oldrel-macos-x86_64               1.1.0                       73.00    OK
r-oldrel-windows-x86_64             1.1.0    23.00     584.00   607.00   OK
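
The ERROR on r-release-windows-x86_64 is detailed below. For reference, a check comparable to the ones in this table can be run locally from a package source checkout. This is a minimal sketch and not part of the CRAN results themselves; it assumes the rcmdcheck package is installed, and local toolchains and timings will differ from the CRAN flavors.

    # Run R CMD check with CRAN settings from the package root.
    # 'path' and 'args' are illustrative; adjust to the local checkout.
    chk <- rcmdcheck::rcmdcheck(path = ".", args = "--as-cran")
    chk$errors    # any ERROR sections, e.g. the test failure shown below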

Check Details

Version: 1.1.0
Check: tests
Result: ERROR
  Running 'testthat.R' [139s]
  Running the tests in 'tests/testthat.R' failed.
  Complete output:
    > # This file is part of the standard setup for testthat.
    > # It is recommended that you do not modify it.
    > #
    > # Where should you do additional test configuration?
    > # Learn more about the roles of various files in:
    > # * https://r-pkgs.org/tests.html
    > # * https://testthat.r-lib.org/reference/test_package.html#special-files
    >
    > library(testthat)
    > library(spinner)
    >
    > test_check("spinner")
    OMP: Warning #96: Cannot form a team with 48 threads, using 2 instead.
    OMP: Hint Consider unsetting KMP_DEVICE_THREAD_LIMIT (KMP_ALL_THREADS), KMP_TEAMS_THREAD_LIMIT, and OMP_THREAD_LIMIT (if any are set).
    epoch: 10 Train loss: 0.7317824 Val loss: 0.7621946
    epoch: 20 Train loss: 0.6322789 Val loss: 0.7653031
    epoch: 30 Train loss: 0.7221151 Val loss: 0.8039527
    early stop at epoch: 34 Train loss: 0.6300904 Val loss: 0.756388
    epoch: 10 Train loss: 0.7175471 Val loss: 0.5811712
    epoch: 20 Train loss: 0.6909837 Val loss: 0.5899688
    epoch: 30 Train loss: 0.6618134 Val loss: 0.6638206
    epoch: 40 Train loss: 0.6234404 Val loss: 0.6321722
    epoch: 50 Train loss: 0.6612909 Val loss: 0.5771581
    early stop at epoch: 54 Train loss: 0.7050608 Val loss: 0.6041846
    epoch: 10 Train loss: 0.654511 Val loss: 0.3184479
    epoch: 20 Train loss: 0.7061145 Val loss: 0.4157068
    epoch: 30 Train loss: 0.7505819 Val loss: 0.5504788
    epoch: 40 Train loss: 0.5440403 Val loss: 0.5311794
    epoch: 50 Train loss: 0.6966549 Val loss: 0.6133323
    early stop at epoch: 55 Train loss: 0.803764 Val loss: 0.5741208
    epoch: 10 Train loss: 0.5628225 Val loss: 0.3788545
    epoch: 20 Train loss: 0.66157 Val loss: 0.797702
    epoch: 30 Train loss: 0.5826113 Val loss: 0.7785869
    early stop at epoch: 30 Train loss: 0.5826113 Val loss: 0.7785869
    time: 35.16 sec elapsed
    epoch: 10 Train loss: 0.7409889 Val loss: 0.7343982
    epoch: 20 Train loss: 0.6867293 Val loss: 0.7120999
    epoch: 30 Train loss: 0.6315681 Val loss: 0.7207493
    early stop at epoch: 31 Train loss: 0.5204234 Val loss: 0.777499
    epoch: 10 Train loss: 0.6564538 Val loss: 0.7517146
    epoch: 20 Train loss: 0.5716132 Val loss: 0.7110354
    epoch: 30 Train loss: 0.6047138 Val loss: 0.7200903
    early stop at epoch: 30 Train loss: 0.6047138 Val loss: 0.7200903
    epoch: 10 Train loss: 0.7298493 Val loss: 0.5944011
    epoch: 20 Train loss: 0.701436 Val loss: 0.7832389
    epoch: 30 Train loss: 0.6641642 Val loss: 0.6373111
    early stop at epoch: 34 Train loss: 0.5590758 Val loss: 0.7329341
    epoch: 10 Train loss: 0.6905536 Val loss: 0.6138876
    epoch: 20 Train loss: 0.6715042 Val loss: 0.722386
    epoch: 30 Train loss: 0.694303 Val loss: 0.7603433
    early stop at epoch: 33 Train loss: 0.7030374 Val loss: 0.7828748
    time: 25.46 sec elapsed
    epoch: 10 Train loss: 0.3249352 Val loss: 0.3522149
    epoch: 20 Train loss: 0.3366883 Val loss: 0.2461161
    epoch: 30 Train loss: 0.3517227 Val loss: 0.258471
    early stop at epoch: 31 Train loss: 0.2865163 Val loss: 0.5111788
    epoch: 10 Train loss: 0.335785 Val loss: 0.4863305
    epoch: 20 Train loss: 0.3319254 Val loss: 0.3984242
    epoch: 30 Train loss: 0.338305 Val loss: 0.2768511
    early stop at epoch: 32 Train loss: 0.3111433 Val loss: 0.4069659
    epoch: 10 Train loss: 0.2867069 Val loss: 0.4243255
    epoch: 20 Train loss: 0.2477084 Val loss: 0.4297781
    epoch: 30 Train loss: 0.25424 Val loss: 0.3658085
    early stop at epoch: 30 Train loss: 0.25424 Val loss: 0.3658085
    epoch: 10 Train loss: 0.2299772 Val loss: 0.3072349
    epoch: 20 Train loss: 0.2680575 Val loss: 0.169904
    epoch: 30 Train loss: 0.2679525 Val loss: 0.1344198
    early stop at epoch: 38 Train loss: 0.2458126 Val loss: 0.3700171
    time: 26.53 sec elapsed
    epoch: 10 Train loss: 0.5702409 Val loss: 0.4774213
    epoch: 20 Train loss: 0.5702409 Val loss: 0.4765813
    epoch: 30 Train loss: 0.5702409 Val loss: 0.4765813
    epoch: 40 Train loss: 0.5702409 Val loss: 0.4765813
    epoch: 50 Train loss: 0.5702409 Val loss: 0.4765813
    epoch: 60 Train loss: 0.5702409 Val loss: 0.4765813
    epoch: 70 Train loss: 0.5702409 Val loss: 0.4765813
    epoch: 80 Train loss: 0.5702409 Val loss: 0.4765813
    epoch: 90 Train loss: 0.5702409 Val loss: 0.4765813
    epoch: 100 Train loss: 0.5702409 Val loss: 0.4765813
    epoch: 10 Train loss: 0.4554161 Val loss: 0.6724439
    epoch: 20 Train loss: 0.4554161 Val loss: 0.6724439
    epoch: 30 Train loss: 0.4554161 Val loss: 0.6724439
    early stop at epoch: 31 Train loss: 0.4554161 Val loss: 0.7056807
    epoch: 10 Train loss: 0.5752563 Val loss: 0.6791061
    epoch: 20 Train loss: 0.5752563 Val loss: 0.6130908
    epoch: 30 Train loss: 0.5752563 Val loss: 0.6069332
    early stop at epoch: 37 Train loss: 0.5752563 Val loss: 0.6567641
    time: 13.96 sec elapsed
    epoch: 10 Train loss: 0.5652817 Val loss: 0.5945494
    epoch: 20 Train loss: 0.5766739 Val loss: 0.6136201
    epoch: 30 Train loss: 0.5766739 Val loss: 0.6142852
    early stop at epoch: 32 Train loss: 0.5766739 Val loss: 0.6047716
    epoch: 10 Train loss: 0.7769034 Val loss: 0.6501483
    epoch: 20 Train loss: 0.7769034 Val loss: 0.6713287
    epoch: 30 Train loss: 0.7769034 Val loss: 0.6501483
    epoch: 40 Train loss: 0.8044059 Val loss: 0.6763625
    early stop at epoch: 40 Train loss: 0.8044059 Val loss: 0.6763625
    epoch: 10 Train loss: 0.6855377 Val loss: 0.6947392
    epoch: 20 Train loss: 0.6770796 Val loss: 0.6942887
    epoch: 30 Train loss: 0.6759621 Val loss: 0.7478956
    early stop at epoch: 30 Train loss: 0.6759621 Val loss: 0.7478956
    time: 10.37 sec elapsed
    epoch: 10 Train loss: 0.5610423 Val loss: 0.6719236
    epoch: 20 Train loss: 0.603376 Val loss: 0.7066973
    epoch: 30 Train loss: 0.585977 Val loss: 0.6936437
    early stop at epoch: 34 Train loss: 0.5812439 Val loss: 0.7304418
    epoch: 10 Train loss: 0.4583791 Val loss: 0.4016303
    epoch: 20 Train loss: 0.1186387 Val loss: 0.5465474
    epoch: 30 Train loss: 0.4981215 Val loss: 0.5338627
    epoch: 40 Train loss: 0.4663077 Val loss: 0.364223
    early stop at epoch: 43 Train loss: 0.1277899 Val loss: 0.5148605
    epoch: 10 Train loss: 0.3682945 Val loss: 0.4483344
    epoch: 20 Train loss: 0.4255118 Val loss: 0.3754213
    epoch: 30 Train loss: 0.4197976 Val loss: 0.4472946
    early stop at epoch: 33 Train loss: 0.4434004 Val loss: 0.5070518
    time: 20.74 sec elapsed
    random search: 45.09 sec elapsed
    [ FAIL 1 | WARN 66 | SKIP 0 | PASS 46 ]

    ══ Failed tests ════════════════════════════════════════════════════════════════
    ── Error ('test.R:89:13'): Correct outcome format and size for base outcome3 ───
    <purrr_error_indexed/rlang_error/error/condition>
    Error in `purrr::pmap(hyper_params, ~spinner(graph, target, node_labels,
      edge_labels, context_labels, direction = ..1, sampling = NA, threshold = 0.01,
      method = ..2, node_embedding_size = ..13, edge_embedding_size = ..14,
      context_embedding_size = ..15, update_order = ..3, n_layers = ..4,
      skip_shortcut = ..5, forward_layer = ..6, forward_activation = ..7,
      forward_drop = ..8, mode = ..9, optimization = ..10, epochs, lr = ..11,
      patience, weight_decay = ..12, reps, folds, holdout, verbose, seed))`:
    i In index: 1.
    Caused by error in `pmap()`:
    i In index: 1.
    Caused by error in `training_function()`:
    ! not enough data for training

    [ FAIL 1 | WARN 66 | SKIP 0 | PASS 46 ]
    Error: Test failures
    Execution halted
Flavor: r-release-windows-x86_64
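
To investigate the failure outside of CRAN, one option is to re-run only the failing test file from a source checkout. This is a minimal sketch, not taken from the package: it assumes spinner 1.1.0 is installed and that the working directory is the package root. The Sys.unsetenv() call simply follows the OMP hint quoted in the log and only matters if those variables are set on the local machine.

    # Follow the OMP hint from the log: clear thread-limit variables if present
    Sys.unsetenv(c("KMP_DEVICE_THREAD_LIMIT", "KMP_ALL_THREADS",
                   "KMP_TEAMS_THREAD_LIMIT", "OMP_THREAD_LIMIT"))

    library(testthat)
    library(spinner)

    # Re-run just the file containing the failing expectation (test.R:89)
    testthat::test_file("tests/testthat/test.R")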