Update to unified kth/0.78.0 Conan 2 package #3
Conversation
Note: Reviews paused

It looks like this branch is under active development. To avoid overwhelming you with review comments due to an influx of new commits, CodeRabbit has automatically paused this review. You can configure this behavior in the review settings, or resume it with the usual CodeRabbit review commands.
📝 Walkthrough

Removes legacy CI scripts/workflows, adds a GitHub Actions CI pipeline with cibuildwheel + Conan 2 integration, migrates the Conan recipe and setup/build flow to Conan 2, changes Python C‑API bindings (renames/removes/adds functions, BCH encoders), updates config structs/bindings, adds type stubs, and updates tests/examples and packaging metadata.
```mermaid
sequenceDiagram
    participant GH as "GitHub Actions"
    participant Checkout as "Checkout"
    participant Conan as "Conan (cache)"
    participant CIBW as "cibuildwheel"
    participant BuildEnv as "Build env (Python + compilers)"
    participant WheelTest as "Per-wheel smoke tests"
    participant Artifact as "Artifact store"
    participant Publish as "PyPI / TestPyPI / Release"
    GH->>Checkout: checkout full history
    GH->>Conan: restore/cache ~/.conan2
    GH->>BuildEnv: set env (KTH_MARCH_ID, KTH_CURRENCY, CIBW_ENVIRONMENT)
    GH->>CIBW: run cibuildwheel (CIBW_BEFORE_ALL_* → conan install)
    CIBW->>BuildEnv: invoke wheel build per Python/OS
    BuildEnv->>Conan: run conan install / run_conan_install()
    BuildEnv->>Artifact: emit wheel files
    CIBW->>WheelTest: run per-wheel import and pytest
    WheelTest->>Artifact: report results
    GH->>Artifact: upload ./wheelhouse/*.whl / dist/*.tar.gz
    GH->>Publish: on master/tag → publish artifacts (OIDC)
```
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~45 minutes
🚥 Pre-merge checks | ✅ 2 | ❌ 1

❌ Failed checks (1 warning)
✅ Passed checks (2 passed)
✏️ Tip: You can configure your own custom pre-merge checks in the settings.
Actionable comments posted: 3
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (3)
src/module.c (3)
778-786: ⚠️ Potential issue | 🟡 Minor

Same pattern: `config_authorities_to_py` doesn't check for NULL elements.

This has the same issue as `config_checkpoints_to_py`: if `config_authority_to_py` fails, `NULL` is inserted into the list.

Proposed fix

```diff
 PyObject* config_authorities_to_py(kth_authority* authorities, size_t n) {
     PyObject* pyArr = PyList_New(n);
+    if (pyArr == NULL) return NULL;
     for (size_t i = 0; i < n; ++i) {
         PyObject* elem = config_authority_to_py(authorities);
+        if (elem == NULL) {
+            Py_DECREF(pyArr);
+            return NULL;
+        }
         PyList_SetItem(pyArr, i, elem);
         ++authorities;
     }
     return pyArr;
 }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/module.c` around lines 778 - 786, In config_authorities_to_py, ensure you handle allocation failures: check the result of PyList_New and each call to config_authority_to_py (the local variable elem) for NULL; if PyList_New returns NULL return NULL immediately, and if any elem is NULL, DECREF the partially-built pyArr (Py_DECREF(pyArr)) and return NULL instead of inserting NULL into the list; use PyList_SetItem as before (it steals the reference) only after verifying elem is non-NULL.
798-806: ⚠️ Potential issue | 🟡 Minor

Same pattern: `config_endpoints_to_py` doesn't check for NULL elements.

Proposed fix

```diff
 PyObject* config_endpoints_to_py(kth_endpoint* endpoints, size_t n) {
     PyObject* pyArr = PyList_New(n);
+    if (pyArr == NULL) return NULL;
     for (size_t i = 0; i < n; ++i) {
         PyObject* elem = config_endpoint_to_py(endpoints);
+        if (elem == NULL) {
+            Py_DECREF(pyArr);
+            return NULL;
+        }
         PyList_SetItem(pyArr, i, elem);
         ++endpoints;
     }
     return pyArr;
 }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/module.c` around lines 798 - 806, The function config_endpoints_to_py currently ignores NULL returns from config_endpoint_to_py and doesn't handle allocation failures; update it to check that PyList_New(n) returns non-NULL, then for each endpoint call config_endpoint_to_py(endpoints) and if elem is NULL clean up by Py_DECREF(pyArr) and return NULL, otherwise use PyList_SetItem(pyArr, i, elem) (remember it steals the reference) and continue; also advance the endpoints pointer as now. Ensure any early exit decrefs pyArr to avoid leaks and return NULL on error.
613-621: ⚠️ Potential issue | 🟡 Minor

`PyList_SetItem` return value not checked; NULL element not handled.

If `config_checkpoint_to_py` returns `NULL` (e.g., due to `KTH_TRY_SETATTR` failure), that `NULL` is inserted into the list, and `PyList_SetItem`'s error return is ignored. This could lead to a list containing `NULL` elements or a silent failure.

Proposed fix

```diff
 PyObject* config_checkpoints_to_py(kth_checkpoint* checkpoint, size_t n) {
     PyObject* pyArr = PyList_New(n);
+    if (pyArr == NULL) return NULL;
     for (size_t i = 0; i < n; ++i) {
         PyObject* elem = config_checkpoint_to_py(checkpoint);
-        PyList_SetItem(pyArr, i, elem);
+        if (elem == NULL) {
+            Py_DECREF(pyArr);
+            return NULL;
+        }
+        PyList_SetItem(pyArr, i, elem); // steals ref, no need to check return
         ++checkpoint;
     }
     return pyArr;
 }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/module.c` around lines 613 - 621, In config_checkpoints_to_py, handle failures from config_checkpoint_to_py and PyList_SetItem: after calling config_checkpoint_to_py(checkpoint) check if elem is NULL and if so DECREF pyArr (Py_DECREF) and return NULL to propagate the error; after PyList_SetItem, check its return value and if it returns -1, Py_DECREF(elem) and Py_DECREF(pyArr) and return NULL; remember PyList_SetItem steals the reference on success so only DECREF elem on failure paths. This ensures no NULL elements are left in the list and Python errors are propagated correctly.
🧹 Nitpick comments (2)
tests/test_config.py (1)
8-9: Consider renaming `hash` parameter to avoid shadowing builtin.

The parameter name `hash` shadows Python's built-in `hash()` function. While harmless in this context, renaming to `hash_bytes` or `block_hash` would silence the linter.

Proposed fix

```diff
-def hash_to_hex(hash):
-    return ''.join('{:02x}'.format(x) for x in hash[::-1])
+def hash_to_hex(hash_bytes):
+    return ''.join('{:02x}'.format(x) for x in hash_bytes[::-1])
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@tests/test_config.py` around lines 8 - 9, Rename the parameter of function hash_to_hex to avoid shadowing the built-in hash() (e.g., use hash_bytes or block_hash); update the function signature and all internal uses (the generator expression using hash[::-1]) to the new name so the logic remains the same and linter warnings are resolved.

setup.py (1)
175-175: Specify explicit encoding for file reads.

Both `open()` calls lack explicit encoding. This can cause issues on systems where the default encoding isn't UTF-8.

♻️ Proposed fix

```diff
-exec(open(os.path.join(HERE, 'version.py')).read())
+exec(open(os.path.join(HERE, 'version.py'), encoding='utf-8').read())
 setup(
     name=PKG_NAME,
     version=__version__,  # noqa: F821 — defined by exec(version.py)
     python_requires='>=3.8',
-    long_description=open(os.path.join(HERE, 'README.md')).read(),
+    long_description=open(os.path.join(HERE, 'README.md'), encoding='utf-8').read(),
```

Also applies to: 181-181
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@setup.py` at line 175, The file uses exec(open(os.path.join(HERE, 'version.py')).read()) (and a second similar open call) without an explicit encoding; change both to open(..., encoding='utf-8') (preferably using a with statement) so the file is read with UTF-8 encoding and file handles are properly closed—locate the exec(open(...)) calls and add encoding='utf-8' (and wrap in with open(..., encoding='utf-8') as f: exec(f.read()) if you choose to adopt the context-manager pattern).
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@setup.py`:
- Around line 111-126: The BuildExtCommand.run() appends GNU linker group flags
to ext.extra_link_args unconditionally, which breaks macOS; change run() so it
only adds ['-Wl,--start-group', *libs, '-Wl,--end-group'] when the platform uses
GNU/ELF linkers (e.g., sys.platform.startswith('linux') or platform.system() ==
'Linux'); leave Darwin-specific extra_link_args (the existing macOS override
around lines 172-173) untouched and ensure ext.extra_link_args concatenation
occurs conditionally so macOS/Darwin builds do not receive the
--start-group/--end-group flags.
- Around line 167-168: The build flags currently set via
extra_compile_args=['-std=c++23'] and extra_link_args=['-lstdc++'] are not
portable; modify the extension build setup to detect the compiler type/version
(e.g., by inspecting self.compiler.compiler_type or using
sysconfig.get_config_var and running a version probe in a custom build_ext or
before creating the Extension) and set compile flags accordingly: use
'-std=c++23' for GCC >=11 and Clang >=17, use '/std:c++23preview' for MSVC
(Visual Studio >=16.10), and fall back to '-std=c++20' for older GCC/Clang; keep
'-lstdc++' only when using Unix-like linkers and omit it for MSVC. Ensure you
update where extra_compile_args and extra_link_args are assigned (the Extension
construction or build_ext subclass) to apply these conditional flags.
In `@src/chain/transaction.cpp`:
- Around line 47-57: The Python example still calls the old two-argument API;
update the call site so it matches the C binding
kth_py_native_chain_transaction_factory_from_data which no longer accepts a
version parameter — replace transaction_factory_from_data(0, tx_buffer) with
transaction_factory_from_data(tx_buffer) so the Python call passes only the
tx_buffer.
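As a sketch of the platform gating requested for the linker-group flags in setup.py above (the helper name `group_link_args` is ours, not part of the PR):

```python
import sys


def group_link_args(libs, platform=None):
    """Wrap library flags in GNU --start-group/--end-group on Linux only.

    `libs` is a list of '-l...' flags; platform defaults to sys.platform.
    """
    platform = platform or sys.platform
    if platform.startswith('linux'):
        # GNU ld re-scans archives inside a group, resolving circular
        # static-library dependencies between the kth libraries.
        return ['-Wl,--start-group', *libs, '-Wl,--end-group']
    # macOS ld64 (and MSVC's link.exe) reject --start-group; pass through.
    return list(libs)
```

A call site would then do `ext.extra_link_args += group_link_args(libs)` instead of appending the group flags unconditionally, leaving the Darwin-specific override untouched.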
---
Outside diff comments:
In `@src/module.c`:
- Around line 778-786: In config_authorities_to_py, ensure you handle allocation
failures: check the result of PyList_New and each call to config_authority_to_py
(the local variable elem) for NULL; if PyList_New returns NULL return NULL
immediately, and if any elem is NULL, DECREF the partially-built pyArr
(Py_DECREF(pyArr)) and return NULL instead of inserting NULL into the list; use
PyList_SetItem as before (it steals the reference) only after verifying elem is
non-NULL.
- Around line 798-806: The function config_endpoints_to_py currently ignores
NULL returns from config_endpoint_to_py and doesn't handle allocation failures;
update it to check that PyList_New(n) returns non-NULL, then for each endpoint
call config_endpoint_to_py(endpoints) and if elem is NULL clean up by
Py_DECREF(pyArr) and return NULL, otherwise use PyList_SetItem(pyArr, i, elem)
(remember it steals the reference) and continue; also advance the endpoints
pointer as now. Ensure any early exit decrefs pyArr to avoid leaks and return
NULL on error.
- Around line 613-621: In config_checkpoints_to_py, handle failures from
config_checkpoint_to_py and PyList_SetItem: after calling
config_checkpoint_to_py(checkpoint) check if elem is NULL and if so DECREF pyArr
(Py_DECREF) and return NULL to propagate the error; after PyList_SetItem, check
its return value and if it returns -1, Py_DECREF(elem) and Py_DECREF(pyArr) and
return NULL; remember PyList_SetItem steals the reference on success so only
DECREF elem on failure paths. This ensures no NULL elements are left in the list
and Python errors are propagated correctly.
---
Nitpick comments:
In `@setup.py`:
- Line 175: The file uses exec(open(os.path.join(HERE, 'version.py')).read())
(and a second similar open call) without an explicit encoding; change both to
open(..., encoding='utf-8') (preferably using a with statement) so the file is
read with UTF-8 encoding and file handles are properly closed—locate the
exec(open(...)) calls and add encoding='utf-8' (and wrap in with open(...,
encoding='utf-8') as f: exec(f.read()) if you choose to adopt the
context-manager pattern).
In `@tests/test_config.py`:
- Around line 8-9: Rename the parameter of function hash_to_hex to avoid
shadowing the built-in hash() (e.g., use hash_bytes or block_hash); update the
function signature and all internal uses (the generator expression using
hash[::-1]) to the new name so the logic remains the same and linter warnings
are resolved.
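The context-manager pattern suggested for the setup.py file reads could look like this (the `read_version` helper and the throwaway file are illustrative, not code from the PR):

```python
import os
import tempfile


def read_version(path):
    # Explicit encoding avoids locale-dependent decoding; the context
    # manager guarantees the file handle is closed.
    ns = {}
    with open(path, encoding='utf-8') as f:
        exec(f.read(), ns)
    return ns['__version__']


# Demo against a throwaway version.py:
with tempfile.TemporaryDirectory() as d:
    vp = os.path.join(d, 'version.py')
    with open(vp, 'w', encoding='utf-8') as f:
        f.write("__version__ = '0.78.0'\n")
    print(read_version(vp))  # prints 0.78.0
```

Executing into a namespace dict rather than the module globals also sidesteps the `noqa: F821` needed when `exec()` defines `__version__` at module level.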
🪄 Autofix (Beta)
Fix all unresolved CodeRabbit comments on this PR:
- Push a commit to this branch (recommended)
- Create a new PR with the fixes
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: 23ed4c05-d925-4e1b-88b8-cbe91f805437
📒 Files selected for processing (31)
- .appveyor.yml
- .gitignore
- .travis/entrypoint copy 2.sh
- .travis/entrypoint.sh
- .travis/entrypoint_debian.sh
- AUTHORS
- __init__.py
- build_and_upload_pkg.bat
- conanfile.py
- include/kth/py-native/chain/payment_address.h
- include/kth/py-native/chain/script.h
- include/kth/py-native/chain/transaction.h
- include/kth/py-native/config/blockchain_settings.h
- include/kth/py-native/config/database_settings.h
- scripts/build.sh
- scripts/build_and_upload_pkg.sh
- scripts/build_macOs.sh
- scripts/copy_kth_from_conan_cache.sh
- scripts/reset.sh
- setup.py
- src/chain/block.cpp
- src/chain/header.cpp
- src/chain/output_point.cpp
- src/chain/payment_address.cpp
- src/chain/script.cpp
- src/chain/transaction.cpp
- src/config/database_settings.cpp
- src/module.c
- src/node.cpp
- tests/test_config.py
- tests/test_node.py
💤 Files with no reviewable changes (8)
- AUTHORS
- build_and_upload_pkg.bat
- __init__.py
- .travis/entrypoint_debian.sh
- .travis/entrypoint copy 2.sh
- .appveyor.yml
- include/kth/py-native/chain/transaction.h
- .travis/entrypoint.sh
Force-pushed from 30c2bb5 to ef0fe46 (Compare)
Actionable comments posted: 5
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
examples/tx_serialization.py (1)

36-37: ⚠️ Potential issue | 🟡 Minor

Python 2 syntax incompatible with Python 3.

Line 37 uses `.encode('hex')`, which is Python 2 syntax. In Python 3, bytes objects don't have an `encode` method that accepts `'hex'`; this will raise `AttributeError`. The same issue exists on line 54.

🐛 Proposed fix

```diff
-    input_data_hex = input_data.encode('hex')
+    input_data_hex = input_data.hex()
```

And similarly for line 54:

```diff
-    output_data_hex = output_data.encode('hex')
+    output_data_hex = output_data.hex()
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@examples/tx_serialization.py` around lines 36 - 37, The code uses Python2 `.encode('hex')` on the bytes returned by bn.input_to_data (variables input_data and the similar one at line 54), which will raise in Python3; replace those calls with a Python3-compatible hex conversion such as using the bytes.hex() method or binascii.hexlify(...) and decode to str if needed (i.e., convert input_data to its hex string via input_data.hex() or binascii.hexlify(input_data).decode('ascii') wherever input_data_hex is assigned and for the analogous variable on line 54).
♻️ Duplicate comments (1)
setup.py (1)
176-178: ⚠️ Potential issue | 🔴 Critical

Gate the C++ flags by compiler/toolchain.

Lines 177-178 still bake Unix/GNU-style flags into every build. MSVC doesn't accept `-std=c++23` or `-lstdc++`, so the Windows path remains broken.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@setup.py` around lines 176 - 178, The Unix/GNU C++ flags are being unconditionally applied in setup.py (extra_compile_args=['-std=c++23'], extra_link_args=['-lstdc++']); gate these by detecting the compiler/toolchain (e.g., inspect self.compiler.compiler_type inside a custom build_ext subclass or check sys.platform + compiler type) and set flags per toolchain: for MSVC use ['/std:c++23'] (no '-lstdc++'), for GCC/Clang use ['-std=c++23'] and link ['-lstdc++'] only when using those linkers; update the Extension configuration to use the conditionally-built extra_compile_args/extra_link_args (or override build_extensions in a BuildExt class) so Windows/MSVC builds no longer receive Unix/GNU flags.
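The toolchain-conditional flag selection described above could be sketched like this (the function name and the simplified version threshold are ours; the review's exact rule also distinguishes Clang ≥ 17 from GCC ≥ 11, which a real probe would handle per compiler):

```python
def cxx_std_flags(compiler_type, gcc_like_version=None):
    """Pick C++-standard compile/link flags per toolchain.

    compiler_type follows distutils naming ('msvc', 'unix', 'mingw32');
    gcc_like_version is a (major, minor) tuple for GCC/Clang when known.
    Returns (extra_compile_args, extra_link_args).
    """
    if compiler_type == 'msvc':
        # MSVC uses /std: syntax and links its C++ runtime implicitly,
        # so no -lstdc++ equivalent is needed.
        return ['/std:c++23preview'], []
    if gcc_like_version and gcc_like_version >= (11, 0):
        return ['-std=c++23'], ['-lstdc++']
    # Conservative fallback for older GCC/Clang.
    return ['-std=c++20'], ['-lstdc++']
```

A custom `build_ext` subclass would call this from `build_extensions()` using `self.compiler.compiler_type` and assign the results to each extension before compiling.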
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In @.github/workflows/ci.yml:
- Around line 118-119: Update the GitHub Actions step named "🛞 Build wheels
with cibuildwheel" to use the newer cibuildwheel release by changing the `uses`
reference from `pypa/cibuildwheel@v3.2` to `pypa/cibuildwheel@v3.4.1`; locate
the step with the name "🛞 Build wheels with cibuildwheel" in the CI workflow
and replace the version tag accordingly to pin the action to v3.4.1.
In `@conanfile.py`:
- Around line 27-39: The generate method currently appends headers/libs into
inc_dst and lib_dst and can leave stale files; before iterating
dependencies.host you should remove or empty the staging directories (inc_dst
and lib_dst) and recreate them so staging is deterministic; implement a cleanup
step at the top of generate (before the for loop) that deletes contents of
self.source_folder/kth/include and self.source_folder/kth/lib (or the
directories themselves) and then ensures the directories exist, so subsequent
copy(...) calls (the copy calls using dep.cpp_info.includedirs and
dep.cpp_info.libdirs) stage only the current dependency payload.
In `@setup.py`:
- Around line 25-39: The build currently hard-codes KTH_CURRENCY_BCH while
run_conan_install(microarch, currency) and the Conan recipe accept other
currencies, causing mismatched artifacts; fix this by deriving the compile-time
define from the same currency parameter used by run_conan_install (e.g. compute
currency_macro = f"KTH_CURRENCY_{(currency or os.environ.get('KTH_CURRENCY') or
'BCH').upper()}" and use that instead of the literal KTH_CURRENCY_BCH in the
Extension's define list and any other places that add the currency define), and
update conanfile.py to expose and honor the currency option rather than pinning
BCH so the recipe and setup use the same currency selection (keep BCH as the
default when no currency param/env is provided).
In `@src/chain/payment_address.cpp`:
- Around line 21-31: The exposed Python native function was renamed to
kth_py_native_wallet_payment_address_encoded_legacy causing a breaking API
change for callers expecting the old name; to fix, add a compatibility wrapper
or alias named kth_py_native_wallet_payment_address_encoded that forwards to the
existing kth_py_native_wallet_payment_address_encoded_legacy (or directly calls
kth_wallet_payment_address_encoded_legacy) so existing Python imports continue
to work, and add a short deprecation comment and update release notes to
indicate the legacy name and planned removal timeline.
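The currency-define derivation suggested for setup.py above might look like the following (helper name ours; only the BCH default and the KTH_CURRENCY env var come from the review):

```python
import os


def currency_macro(currency=None):
    """Derive the compile-time KTH_CURRENCY_* define from the same
    selection run_conan_install uses: explicit parameter first, then the
    KTH_CURRENCY environment variable, then the BCH default.
    """
    cur = (currency or os.environ.get('KTH_CURRENCY') or 'BCH').upper()
    return f'KTH_CURRENCY_{cur}'
```

The Extension's define list would then use `(currency_macro(currency), None)` instead of the hard-coded `KTH_CURRENCY_BCH`, keeping the C build and the Conan install in agreement.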
---
Outside diff comments:
In `@examples/tx_serialization.py`:
- Around line 36-37: The code uses Python2 `.encode('hex')` on the bytes
returned by bn.input_to_data (variables input_data and the similar one at line
54), which will raise in Python3; replace those calls with a Python3-compatible
hex conversion such as using the bytes.hex() method or binascii.hexlify(...) and
decode to str if needed (i.e., convert input_data to its hex string via
input_data.hex() or binascii.hexlify(input_data).decode('ascii') wherever
input_data_hex is assigned and for the analogous variable on line 54).
---
Duplicate comments:
In `@setup.py`:
- Around line 176-178: The Unix/GNU C++ flags are being unconditionally applied
in setup.py (extra_compile_args=['-std=c++23'], extra_link_args=['-lstdc++']);
gate these by detecting the compiler/toolchain (e.g., inspect
self.compiler.compiler_type inside a custom build_ext subclass or check
sys.platform + compiler type) and set flags per toolchain: for MSVC use
['/std:c++23'] (no '-lstdc++'), for GCC/Clang use ['-std=c++23'] and link
['-lstdc++'] only when using those linkers; update the Extension configuration
to use the conditionally-built extra_compile_args/extra_link_args (or override
build_extensions in a BuildExt class) so Windows/MSVC builds no longer receive
Unix/GNU flags.
🪄 Autofix (Beta)
Fix all unresolved CodeRabbit comments on this PR:
- Push a commit to this branch (recommended)
- Create a new PR with the fixes
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: 0daeac95-a8a2-4d98-87fe-4efa54adb19c
📒 Files selected for processing (34)
- .appveyor.yml
- .github/workflows/ci.yml
- .github/workflows/knuth.yml
- .gitignore
- .travis/entrypoint copy 2.sh
- .travis/entrypoint.sh
- .travis/entrypoint_debian.sh
- AUTHORS
- __init__.py
- build_and_upload_pkg.bat
- conanfile.py
- examples/tx_serialization.py
- include/kth/py-native/chain/payment_address.h
- include/kth/py-native/chain/script.h
- include/kth/py-native/chain/transaction.h
- include/kth/py-native/config/blockchain_settings.h
- include/kth/py-native/config/database_settings.h
- scripts/build.sh
- scripts/build_and_upload_pkg.sh
- scripts/build_macOs.sh
- scripts/copy_kth_from_conan_cache.sh
- scripts/reset.sh
- setup.py
- src/chain/block.cpp
- src/chain/header.cpp
- src/chain/output_point.cpp
- src/chain/payment_address.cpp
- src/chain/script.cpp
- src/chain/transaction.cpp
- src/config/database_settings.cpp
- src/module.c
- src/node.cpp
- tests/test_config.py
- tests/test_node.py
💤 Files with no reviewable changes (9)
- AUTHORS
- __init__.py
- include/kth/py-native/chain/transaction.h
- build_and_upload_pkg.bat
- .github/workflows/knuth.yml
- .travis/entrypoint.sh
- .appveyor.yml
- .travis/entrypoint copy 2.sh
- .travis/entrypoint_debian.sh
✅ Files skipped from review due to trivial changes (5)
- src/config/database_settings.cpp
- tests/test_node.py
- src/node.cpp
- src/chain/output_point.cpp
- .gitignore
🚧 Files skipped from review as they are similar to previous changes (4)
- include/kth/py-native/chain/script.h
- include/kth/py-native/chain/payment_address.h
- include/kth/py-native/config/database_settings.h
- src/module.c
Force-pushed from b4a2c11 to cb10547 (Compare)
Actionable comments posted: 4
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
src/module.c (1)
613-621: ⚠️ Potential issue | 🟠 Major

List-building helpers don't propagate errors from element conversion.

`config_checkpoints_to_py`, `config_authorities_to_py`, `config_endpoints_to_py`, and `config_strings_to_py` call their respective element converters without checking for `NULL` returns. If an element conversion fails, `PyList_SetItem` receives `NULL`, which doesn't set an error but leaves a hole in the list.

🛡️ Proposed fix for `config_checkpoints_to_py`

```diff
 PyObject* config_checkpoints_to_py(kth_checkpoint* checkpoint, size_t n) {
     PyObject* pyArr = PyList_New(n);
+    if (pyArr == NULL) {
+        return NULL;
+    }
     for (size_t i = 0; i < n; ++i) {
         PyObject* elem = config_checkpoint_to_py(checkpoint);
+        if (elem == NULL) {
+            Py_DECREF(pyArr);
+            return NULL;
+        }
         PyList_SetItem(pyArr, i, elem);
         ++checkpoint;
     }
     return pyArr;
 }
```

Apply the same pattern to `config_authorities_to_py` (line 778), `config_endpoints_to_py` (line 798), and `config_strings_to_py` (line 808).

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/module.c` around lines 613 - 621, The list-builder helpers (config_checkpoints_to_py, config_authorities_to_py, config_endpoints_to_py, config_strings_to_py) call their element converters (e.g., config_checkpoint_to_py) but don’t check for NULL; if a converter fails you must avoid inserting NULL into the list, free the partially built list and propagate the error. Update each function so that after calling the element converter you check if elem == NULL; if so call Py_DECREF(pyArr) and return NULL; otherwise call PyList_SetItem(pyArr, i, elem) as before. Ensure you only call PyList_SetItem when elem is non-NULL so Python errors from the converter are correctly propagated.
🧹 Nitpick comments (4)
tests/test_config.py (1)
8-9: Parameter `hash` shadows Python builtin.

Static analysis flagged that the function parameter `hash` shadows the built-in `hash()` function. Consider renaming to `hash_bytes` or `data`.

♻️ Suggested rename

```diff
-def hash_to_hex(hash):
-    return ''.join('{:02x}'.format(x) for x in hash[::-1])
+def hash_to_hex(hash_bytes):
+    return ''.join('{:02x}'.format(x) for x in hash_bytes[::-1])
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@tests/test_config.py` around lines 8 - 9, The function parameter named `hash` in `hash_to_hex` shadows the built-in `hash()`; rename the parameter (for example to `hash_bytes` or `data`) and update all uses inside `hash_to_hex` accordingly (the generator expression and reversal `hash[::-1]`) so the function keeps the same behavior but no longer shadows the builtin.

conanfile.py (1)
42-43: Consider using `os.path.join` for path construction.

The f-string paths like `f"{self.source_folder}/kth"` work but `os.path.join` is the idiomatic cross-platform approach:

♻️ Suggested change

```diff
-    stage_root = f"{self.source_folder}/kth"
+    stage_root = os.path.join(self.source_folder, "kth")
     if os.path.isdir(stage_root):
         shutil.rmtree(stage_root)
-    inc_dst = f"{stage_root}/include"
-    lib_dst = f"{stage_root}/lib"
+    inc_dst = os.path.join(stage_root, "include")
+    lib_dst = os.path.join(stage_root, "lib")
```

Also applies to: 46-47
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@conanfile.py` around lines 42 - 43, The path construction using f-strings (e.g., stage_root = f"{self.source_folder}/kth") is not cross-platform; replace these with os.path.join to build paths correctly. Update occurrences such as stage_root, and any other uses around the same block (e.g., where you reference self.source_folder with "/kth") to use os.path.join(self.source_folder, "kth") and similarly for the other paths referenced on lines 46-47 to ensure correct path separators on all OSes.

setup.py (2)
24-44: Consider validating `microarch` input before passing to subprocess.

While the `currency` parameter is validated later (lines 160-165), the `microarch` parameter is passed directly to the subprocess without validation. Although the risk is limited since this is a build-time configuration and the command uses a list (not `shell=True`), validating against known march_id values would be more robust.

The static analysis tool flagged `subprocess.check_call` (S603), which is a general warning. In this context, the risk is acceptable since the inputs are build-time options, but input validation would eliminate the concern entirely.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@setup.py` around lines 24 - 44, The run_conan_install function currently forwards microarch directly into the Conan args before calling subprocess.check_call; validate microarch against an allowlist of known march_id values (e.g., a small set of supported strings) before appending '-o', f'kth/*:march_id={microarch}' to cmd, and if the value is invalid raise a clear ValueError (or ignore it) so only vetted values reach subprocess.check_call; keep the existing behavior for currency and ensure the validation logic references the microarch parameter and the cmd construction in run_conan_install.
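An allowlist check like the one suggested could be sketched as follows (the allowed set below is a placeholder, not the real list of march_id values the kth packages publish):

```python
# Illustrative only; populate with the march_id values the kth Conan
# packages actually support.
ALLOWED_MARCH_IDS = frozenset({'x86-64', 'x86-64-v2', 'x86-64-v3', 'x86-64-v4'})


def validate_microarch(microarch, allowed=ALLOWED_MARCH_IDS):
    """Reject unknown march_id values before they reach subprocess args."""
    if microarch not in allowed:
        raise ValueError(f'unsupported march_id: {microarch!r}')
    return microarch
```

`run_conan_install` would call this before appending `-o kth/*:march_id=...` to the command list, so only vetted values ever reach `subprocess.check_call`.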
221-222: macOS `extra_link_args` override discards the extension's original link args.

Line 222 replaces `extra_link_args` entirely rather than appending, which drops `-lstdc++` from line 217. While macOS uses `libc++` (not `libstdc++`) so `-lstdc++` isn't needed, this replacement pattern could silently discard future link args added to the extension.

Consider using append/extend for consistency:

♻️ Suggested change

```diff
 if sys.platform == 'darwin':
-    extensions[0].extra_link_args = ['-stdlib=libc++', '-mmacosx-version-min=14']
+    extensions[0].extra_link_args = ['-stdlib=libc++', '-mmacosx-version-min=14']
+    # Note: -lstdc++ from the base definition is intentionally omitted
+    # on macOS since we link against libc++ instead.
```

The current behavior is functionally correct for macOS since `libc++` is the appropriate C++ standard library. Adding a comment would clarify the intent.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@setup.py` around lines 221 - 222, The code currently replaces extensions[0].extra_link_args for macOS which discards any previously set link args; change the assignment to append/extend the existing extra_link_args on the Extension object (extensions[0]) instead of overwriting it and add a brief comment noting we prefer libc++ on macOS so we add '-stdlib=libc++' and '-mmacosx-version-min=14' while preserving prior flags (e.g., keep any '-lstdc++' or future flags intact).
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@examples/initchain.py`:
- Around line 28-31: The current signal_handler calls sys.exit(0) which bypasses
the finally block and prevents cleanup (node_signal_stop, node_destruct); change
signal_handler to set a module-level shutdown flag (e.g., shutdown_requested =
True) instead of calling sys.exit, remove the sys.exit call, and update main()’s
run loop to check that flag and break so execution flows to the existing finally
block; alternatively, if you prefer immediate cleanup, call node_signal_stop()
and node_destruct() directly from signal_handler, but ensure those functions are
safe to call from a signal context.
In `@kth_native.pyi`:
- Line 248: The stub for config_settings_get_from_file has the wrong return
type; update the function signature for config_settings_get_from_file to match
the C implementation by returning a 3-tuple (success, settings_or_none,
error_message_or_none) — e.g. change its annotation to return tuple[int,
Settings | None, str | None] (or an equivalent typing union) so callers see
(0/1, None or Settings, None or error message) as produced by the C code in
module.c.
In `@src/module.c`:
- Around line 294-297: The stub file kth_native.pyi currently declares
wallet_payment_address_encoded_cashaddr and wallet_payment_address_encoded_token
unconditionally while the C module only defines
kth_py_native_wallet_payment_address_encoded_cashaddr and
kth_py_native_wallet_payment_address_encoded_token under the KTH_CURRENCY_BCH
compile flag; update kth_native.pyi to avoid runtime AttributeError by making
those two declarations BCH-only—either wrap their stub declarations in a
typing.TYPE_CHECKING guard that is annotated/conditioned on a module-level
KTH_CURRENCY_BCH boolean (or a clear "# BCH-only" comment) so type-checkers see
them but runtime code won’t assume they exist, or explicitly mark them as
Optional/raise AttributeError in runtime docs; ensure you reference the exact
symbols wallet_payment_address_encoded_cashaddr,
wallet_payment_address_encoded_token and the native bindings
kth_py_native_wallet_payment_address_encoded_cashaddr/kth_py_native_wallet_payment_address_encoded_token
when making the change.
- Around line 57-60: The KTH_TRY_SETATTR macro doesn't guard against a NULL
value returned by PyObject_CallFunction; update each listed *_to_py function
(kth_py_native_config_node_settings_to_py,
kth_py_native_config_database_settings_to_py, config_checkpoint_to_py,
kth_py_native_config_blockchain_settings_to_py, config_authority_to_py,
config_endpoint_to_py, kth_py_native_config_network_settings_to_py,
kth_py_native_config_settings_to_py) to check the result of
PyObject_CallFunction immediately (e.g., assign to a PyObject* tmp =
PyObject_CallFunction(...); if (tmp == NULL) return NULL;) before using it in
any KTH_PY_SETATTR/KTH_TRY_SETATTR call so you never pass a NULL PyObject into
the setter (Python error is already set, so just return NULL).
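The check-then-use shape this prompt describes can be sketched without the CPython headers; `create_value` and `set_attr` below are hypothetical stand-ins for `PyObject_CallFunction` and the `KTH_PY_SETATTR`/`KTH_TRY_SETATTR` setters, kept in the same guard-before-use shape:

```c
#include <assert.h>
#include <stddef.h>

/* Stand-in for PyObject_CallFunction: may fail and return NULL. */
static int* create_value(int fail) {
    static int boxed = 42;
    return fail ? NULL : &boxed;
}

/* Stand-in for the setter: must never receive NULL. */
static int set_attr(int* value) {
    assert(value != NULL);  /* passing NULL here is exactly the bug being fixed */
    return 0;
}

/* The pattern asked for in each *_to_py function: guard the creator's
 * result before it ever reaches a setter, and propagate the failure. */
static int build_settings(int fail) {
    int* tmp = create_value(fail);
    if (tmp == NULL) {
        return -1;  /* error already recorded by the creator; just propagate */
    }
    return set_attr(tmp);
}
```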
---
Outside diff comments:
In `@src/module.c`:
- Around line 613-621: The list-builder helpers (config_checkpoints_to_py,
config_authorities_to_py, config_endpoints_to_py, config_strings_to_py) call
their element converters (e.g., config_checkpoint_to_py) but don’t check for
NULL; if a converter fails you must avoid inserting NULL into the list, free the
partially built list and propagate the error. Update each function so that after
calling the element converter you check if elem == NULL; if so call
Py_DECREF(pyArr) and return NULL; otherwise call PyList_SetItem(pyArr, i, elem)
as before. Ensure you only call PyList_SetItem when elem is non-NULL so Python
errors from the converter are correctly propagated.
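The cleanup-on-partial-failure flow can be sketched outside the CPython API as well; `elem_to_obj` and `build_list` below are hypothetical stand-ins for `config_checkpoint_to_py` and the `PyList`-building helpers, showing the free-partial-then-propagate shape:

```c
#include <assert.h>
#include <stdlib.h>

/* Stand-in element converter: fails (returns NULL) for negative inputs. */
static int* elem_to_obj(int v) {
    if (v < 0) return NULL;
    int* p = malloc(sizeof *p);
    if (p == NULL) return NULL;
    *p = v;
    return p;
}

/* Build an array of converted elements; on any converter failure, free the
 * partially built array and return NULL, mirroring Py_DECREF(pyArr). */
static int** build_list(const int* src, size_t n) {
    int** arr = calloc(n, sizeof *arr);
    if (arr == NULL) return NULL;
    for (size_t i = 0; i < n; ++i) {
        int* elem = elem_to_obj(src[i]);
        if (elem == NULL) {
            for (size_t j = 0; j < i; ++j) free(arr[j]);
            free(arr);
            return NULL;  /* only insert non-NULL elements; propagate error */
        }
        arr[i] = elem;
    }
    return arr;
}
```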
---
Nitpick comments:
In `@conanfile.py`:
- Around line 42-43: The path construction using f-strings (e.g., stage_root =
f"{self.source_folder}/kth") is not cross-platform; replace these with
os.path.join to build paths correctly. Update occurrences such as stage_root,
and any other uses around the same block (e.g., where you reference
self.source_folder with "/kth") to use os.path.join(self.source_folder, "kth")
and similarly for the other paths referenced on lines 46-47 to ensure correct
path separators on all OSes.
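A minimal sketch of the suggested change, assuming the recipe's staged layout (the `include`/`lib` subfolders here are illustrative, and `source_folder` stands in for `self.source_folder`):

```python
import os

# Build the staging paths with os.path.join so separators are correct on
# every OS, instead of "/"-joined f-strings.
def staging_paths(source_folder):
    stage_root = os.path.join(source_folder, "kth")
    return {
        "root": stage_root,
        "include": os.path.join(stage_root, "include"),
        "lib": os.path.join(stage_root, "lib"),
    }
```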
In `@setup.py`:
- Around line 24-44: The run_conan_install function currently forwards microarch
directly into the Conan args before calling subprocess.check_call; validate
microarch against an allowlist of known march_id values (e.g., a small set of
supported strings) before appending '-o', f'kth/*:march_id={microarch}' to cmd,
and if the value is invalid raise a clear ValueError (or ignore it) so only
vetted values reach subprocess.check_call; keep the existing behavior for
currency and ensure the validation logic references the microarch parameter and
the cmd construction in run_conan_install.
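A sketch of the allowlist validation, under the assumption that the project maintains a small set of supported `march_id` strings (the values below are illustrative, not the official list):

```python
# Illustrative allowlist; the real set of supported march_id values would
# come from the kth package documentation.
KNOWN_MARCH_IDS = {"x86-64", "x86-64-v2", "x86-64-v3", "x86-64-v4"}

def conan_args(microarch=None, currency=None):
    cmd = ["conan", "install", ".", "--build=missing"]
    if microarch is not None:
        # Reject unvetted values before they ever reach subprocess.check_call.
        if microarch not in KNOWN_MARCH_IDS:
            raise ValueError(f"unsupported march_id: {microarch!r}")
        cmd += ["-o", f"kth/*:march_id={microarch}"]
    if currency is not None:  # existing behavior for currency is kept
        cmd += ["-o", f"kth/*:currency={currency}"]
    return cmd
```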
- Around line 221-222: The code currently replaces extensions[0].extra_link_args
for macOS which discards any previously set link args; change the assignment to
append/extend the existing extra_link_args on the Extension object
(extensions[0]) instead of overwriting it and add a brief comment noting we
prefer libc++ on macOS so we add '-stdlib=libc++' and '-mmacosx-version-min=14'
while preserving prior flags (e.g., keep any '-lstdc++' or future flags intact).
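The append-instead-of-overwrite change can be sketched as a small helper (the helper name is hypothetical; in `setup.py` the same lines would act on `extensions[0]` directly):

```python
from setuptools import Extension

def add_macos_link_args(ext):
    # Prefer libc++ on macOS: append the flags so previously configured
    # link args (e.g. -lstdc++ or future flags) survive.
    ext.extra_link_args = list(ext.extra_link_args or [])
    ext.extra_link_args.extend(["-stdlib=libc++", "-mmacosx-version-min=14"])
    return ext
```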
In `@tests/test_config.py`:
- Around line 8-9: The function parameter named `hash` in `hash_to_hex` shadows
the built-in `hash()`; rename the parameter (for example to `hash_bytes` or
`data`) and update all uses inside `hash_to_hex` accordingly (the generator
expression and reversal `hash[::-1]`) so the function keeps the same behavior
but no longer shadows the builtin.
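The renamed helper would look like this, with behavior unchanged (reverse the bytes, since hashes are displayed in the opposite byte order, then hex-encode):

```python
def hash_to_hex(hash_bytes):  # was: hash, which shadows builtins.hash
    return ''.join('{:02x}'.format(b) for b in hash_bytes[::-1])
```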
🪄 Autofix (Beta)
Fix all unresolved CodeRabbit comments on this PR:
- Push a commit to this branch (recommended)
- Create a new PR with the fixes
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: 1953d256-7b20-41db-b9c1-77070f45508f
📒 Files selected for processing (43)
- .appveyor.yml
- .github/workflows/ci.yml
- .github/workflows/knuth.yml
- .gitignore
- .travis/entrypoint copy 2.sh
- .travis/entrypoint.sh
- .travis/entrypoint_debian.sh
- AUTHORS
- __init__.py
- build_and_upload_pkg.bat
- conanfile.py
- examples/config.py
- examples/initchain.py
- examples/settings.py
- examples/tx_serialization.py
- examples/wallet.py
- include/kth/py-native/chain/payment_address.h
- include/kth/py-native/chain/script.h
- include/kth/py-native/chain/transaction.h
- include/kth/py-native/config/blockchain_settings.h
- include/kth/py-native/config/database_settings.h
- kth_native.pyi
- py.typed
- pyproject.toml
- scripts/build.sh
- scripts/build_and_upload_pkg.sh
- scripts/build_macOs.sh
- scripts/copy_kth_from_conan_cache.sh
- scripts/reset.sh
- settings.py
- setup.py
- src/chain/block.cpp
- src/chain/header.cpp
- src/chain/output_point.cpp
- src/chain/payment_address.cpp
- src/chain/script.cpp
- src/chain/transaction.cpp
- src/config/database_settings.cpp
- src/module.c
- src/node.cpp
- tests/test_config.py
- tests/test_node.py
- version.py
💤 Files with no reviewable changes (11)
- AUTHORS
- version.py
- settings.py
- __init__.py
- .github/workflows/knuth.yml
- include/kth/py-native/chain/transaction.h
- build_and_upload_pkg.bat
- .travis/entrypoint_debian.sh
- .travis/entrypoint.sh
- .appveyor.yml
- .travis/entrypoint copy 2.sh
✅ Files skipped from review due to trivial changes (7)
- examples/config.py
- src/config/database_settings.cpp
- .gitignore
- src/node.cpp
- pyproject.toml
- include/kth/py-native/chain/payment_address.h
- .github/workflows/ci.yml
🚧 Files skipped from review as they are similar to previous changes (7)
- tests/test_node.py
- src/chain/output_point.cpp
- include/kth/py-native/chain/script.h
- include/kth/py-native/config/database_settings.h
- src/chain/transaction.cpp
- examples/tx_serialization.py
- include/kth/py-native/config/blockchain_settings.h
Force-pushed from 5753dd3 to 13ceaf8
Actionable comments posted: 5
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
examples/tx_serialization.py (1)
12-44: ⚠️ Potential issue | 🟠 Major
Ensure native transaction cleanup runs on all paths.
If any call after `bn.transaction_factory_from_data(...)` raises, `bn.transaction_destruct(tx)` is skipped. Wrap the body in `try/finally` so the native handle is always released.
Proposed fix

```diff
 def tx_inputs_outputs_size(tx_hex_str):
     tx_bytes = bytes(bytearray.fromhex(tx_hex_str))
     tx = bn.transaction_factory_from_data(tx_bytes)
-
-    inputs = bn.transaction_inputs(tx)
-    inputs_count = bn.input_list_count(inputs)
-    # print(inputs_count)
-
-    outputs = bn.transaction_outputs(tx)
-    outputs_count = bn.output_list_count(outputs)
-    # print(outputs_count)
-
-    # {"input_list_push_back", kth_py_native_input_list_push_back, METH_VARARGS, "..."},
-    # {"input_list_count", kth_py_native_input_list_count, METH_VARARGS, "..."},
-    # {"input_list_nth", kth_py_native_input_list_nth, METH_VARARGS, "..."},
-
-    inputs_bytes_size = 0
-    for i in range(inputs_count):
-        input = bn.input_list_nth(inputs, i)
-        input_data = bn.input_to_data(input, 1)
-        inputs_bytes_size += len(input_data)
-
-    outputs_bytes_size = 0
-    for i in range(outputs_count):
-        output = bn.output_list_nth(outputs, i)
-        output_data = bn.output_to_data(output, 1)
-        outputs_bytes_size += len(output_data)
-
-    bn.transaction_destruct(tx)
-
-    return (inputs_bytes_size, outputs_bytes_size)
+    try:
+        inputs = bn.transaction_inputs(tx)
+        inputs_count = bn.input_list_count(inputs)
+        outputs = bn.transaction_outputs(tx)
+        outputs_count = bn.output_list_count(outputs)
+
+        inputs_bytes_size = 0
+        for i in range(inputs_count):
+            tx_input = bn.input_list_nth(inputs, i)
+            input_data = bn.input_to_data(tx_input, 1)
+            inputs_bytes_size += len(input_data)
+
+        outputs_bytes_size = 0
+        for i in range(outputs_count):
+            tx_output = bn.output_list_nth(outputs, i)
+            output_data = bn.output_to_data(tx_output, 1)
+            outputs_bytes_size += len(output_data)
+
+        return (inputs_bytes_size, outputs_bytes_size)
+    finally:
+        bn.transaction_destruct(tx)
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@examples/tx_serialization.py` around lines 12 - 44, The function tx_inputs_outputs_size must always call bn.transaction_destruct(tx) even if an exception occurs; wrap the logic after creating tx (the call to bn.transaction_factory_from_data) in a try/finally so bn.transaction_destruct(tx) runs in the finally block, leaving the existing calls that use tx (bn.transaction_inputs, bn.input_list_count, bn.transaction_outputs, bn.output_list_count, bn.input_list_nth, bn.input_to_data, bn.output_list_nth, bn.output_to_data) inside the try and returning the computed sizes before the finally returns cleanup.
♻️ Duplicate comments (2)
setup.py (2)
136-138: ⚠️ Potential issue | 🟠 Major
`build_ext` still reuses stale staged libs when options change.
Direct `python setup.py build_ext --microarch=...` or `--currency=...` skips Conan as soon as any archive exists under `kth/lib`, so it can silently link the previous flavor. At minimum, rerun when either option is supplied; ideally keep a small stamp of the staged flavor.
Minimal fix

```diff
-        if not discover_static_libs():
+        if self.microarch or self.currency or not discover_static_libs():
             run_conan_install(self.microarch, self.currency)
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@setup.py` around lines 136 - 138, The build_ext run method currently skips Conan if any archive exists (discover_static_libs()) and thus can reuse stale staged libs when build options change; modify run in class build_ext (the run method shown) to detect when --microarch or --currency options were provided and force re-run of run_conan_install(self.microarch, self.currency) in that case (or implement/check a small stamp file recording the last staged microarch/currency and compare it to self.microarch/self.currency and call run_conan_install when they differ), ensuring discover_static_libs() alone does not short-circuit installation when options changed.
58-60: ⚠️ Potential issue | 🟠 Major
`--currency` still doesn't drive the compile-time `KTH_CURRENCY_*` macro.
The custom commands accept `--currency`, but `_CURRENCY` is frozen from the environment when `setup.py` is imported, before command-line options are parsed. `python setup.py build_ext --currency=BTC` can therefore stage BTC libs and still compile the extension as BCH, which also changes the BCH-only exports in `src/module.c`.
Also applies to: 160-178
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@setup.py` around lines 58 - 60, The package reads the compile-time currency macro from the environment too early (when setup.py is imported) so the --currency option in user_options_extra has no effect; update the custom command handling so the currency value is taken from the parsed command option (implement/adjust initialize_options/finalize_options on the custom command that declares user_options_extra) and ensure that before build_ext runs you either set os.environ['_CURRENCY'] to that parsed value or, better, generate the Extension.define_macros/KTH_CURRENCY_* flags from the parsed option (e.g., in your custom build_ext subclass or in the command finalization where get_ext_kwargs/Extension objects are created) so KTH_CURRENCY_* macros reflect the --currency value rather than the original environment value.
🧹 Nitpick comments (3)
examples/tx_serialization.py (2)
13-13: Use direct hex decoding to avoid an extra allocation.
`bytes.fromhex(tx_hex_str)` is equivalent here and simpler than `bytes(bytearray.fromhex(...))`.
Proposed simplification

```diff
-    tx_bytes = bytes(bytearray.fromhex(tx_hex_str))
+    tx_bytes = bytes.fromhex(tx_hex_str)
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@examples/tx_serialization.py` at line 13, Replace the extra allocation when decoding hex by using the direct classmethod: change the assignment that creates tx_bytes from bytes(bytearray.fromhex(tx_hex_str)) to use bytes.fromhex(tx_hex_str) instead, updating the expression that references tx_hex_str (the tx_bytes variable initialization in examples/tx_serialization.py) to avoid the unnecessary bytearray temporary.
24-27: Remove stale commented C-API mapping lines from the example. These comments look like internal binding table notes and distract from the example flow.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@examples/tx_serialization.py` around lines 24 - 27, Remove the stale commented C-API mapping lines that are distracting from the example: delete the three commented entries referencing "input_list_push_back", "input_list_count", and "input_list_nth" (and their corresponding kth_py_native_* names) so only relevant example code remains; look for those exact symbols in the file (input_list_push_back, kth_py_native_input_list_push_back, input_list_count, kth_py_native_input_list_count, input_list_nth, kth_py_native_input_list_nth) and remove the commented lines.
tests/test_config.py (1)
125-128: Avoid pinning the first seed entry.
You already relaxed the seed count to reduce churn, but asserting `seeds[0]` reintroduces brittleness because upstream can reorder valid defaults without changing the config semantics. Check that the expected endpoint exists anywhere in the list instead.
More stable assertion

```diff
-    assert setts.network.seeds[0].scheme == ''
-    assert setts.network.seeds[0].host == 'seed.flowee.cash'
-    assert setts.network.seeds[0].port == 8333
+    assert any(
+        seed.scheme == '' and seed.host == 'seed.flowee.cash' and seed.port == 8333
+        for seed in setts.network.seeds
+    )
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@tests/test_config.py` around lines 125 - 128, The test pins the first seed entry which is brittle; instead, change the assertions on setts.network.seeds to verify that an entry with scheme=='' , host=='seed.flowee.cash' and port==8333 exists anywhere in the list (e.g. use any(...) over setts.network.seeds or a set membership check) rather than asserting setts.network.seeds[0]; keep the existing len(...) assertion but replace the three seeds[0] assertions with a single existence check referencing setts and setts.network.seeds.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@conanfile.py`:
- Around line 23-29: The configure() currently sets
self.options["kth/*"].currency which creates a final (non-overridable)
assignment; remove that assignment from the configure() method and instead add a
default_options mapping on the ConanFile class (e.g. default_options = {
"kth/*:currency": os.environ.get("KTH_CURRENCY", "BCH") }) so the
environment-provided default is used but can still be overridden by CLI/profile
(e.g. -o kth/*:currency=BTC or setup.py::run_conan_install()); leave other
option assignments (like shared and db) untouched.
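A sketch of the relevant fragment of the Conan 2 recipe after this change; `KnuthPyNativeRecipe` is a stand-in class name (the real recipe derives from `conan.ConanFile`), and only the `default_options` part matters here:

```python
import os

class KnuthPyNativeRecipe:  # stands in for a conan.ConanFile subclass
    # The env-derived value is only a default now, so a CLI/profile
    # override like `-o kth/*:currency=BTC` still wins.
    default_options = {
        "kth/*:currency": os.environ.get("KTH_CURRENCY", "BCH"),
    }
```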
In `@README.md`:
- Around line 126-143: The fenced code block in README.md beginning with the
opening triple backticks (```) needs a language identifier to satisfy
markdownlint MD040; update that opening fence to include a language (e.g.,
change ``` to ```text or ```bash) so the directory tree block is tagged, leaving
the block contents unchanged and keeping the closing triple backticks intact.
- Line 84: The sed invocation "sed -i 's/compiler.cppstd=.*/compiler.cppstd=23/'
\"$(conan profile path default)\"" is GNU-specific and fails on macOS; update
the README command to a portable form by either using BSD-compatible sed with an
explicit empty backup suffix (i.e., pass an empty suffix to -i) or replace the
sed call with a cross-platform alternative such as perl -pi -e (or another
POSIX-safe edit) so the profile line "compiler.cppstd=..." is reliably replaced
on both Linux and macOS.
In `@src/module.c`:
- Line 251: The Python stub for chain_script_construct_from_bytes is out of sync
with the native binding: update the declaration for
chain_script_construct_from_bytes in kth_native.pyi to accept the second
optional prefix parameter (matching the C parser signature "y#p") so the stub
reflects (data: bytes, prefix: Optional[int] = ...) or equivalent typing; ensure
the stub's return type matches the native function and adjust any callers/tests
to pass the prefix when required so runtime TypeError is avoided.
---
Outside diff comments:
In `@examples/tx_serialization.py`:
- Around line 12-44: The function tx_inputs_outputs_size must always call
bn.transaction_destruct(tx) even if an exception occurs; wrap the logic after
creating tx (the call to bn.transaction_factory_from_data) in a try/finally so
bn.transaction_destruct(tx) runs in the finally block, leaving the existing
calls that use tx (bn.transaction_inputs, bn.input_list_count,
bn.transaction_outputs, bn.output_list_count, bn.input_list_nth,
bn.input_to_data, bn.output_list_nth, bn.output_to_data) inside the try and
returning the computed sizes before the finally returns cleanup.
---
Duplicate comments:
In `@setup.py`:
- Around line 136-138: The build_ext run method currently skips Conan if any
archive exists (discover_static_libs()) and thus can reuse stale staged libs
when build options change; modify run in class build_ext (the run method shown)
to detect when --microarch or --currency options were provided and force re-run
of run_conan_install(self.microarch, self.currency) in that case (or
implement/check a small stamp file recording the last staged microarch/currency
and compare it to self.microarch/self.currency and call run_conan_install when
they differ), ensuring discover_static_libs() alone does not short-circuit
installation when options changed.
- Around line 58-60: The package reads the compile-time currency macro from the
environment too early (when setup.py is imported) so the --currency option in
user_options_extra has no effect; update the custom command handling so the
currency value is taken from the parsed command option (implement/adjust
initialize_options/finalize_options on the custom command that declares
user_options_extra) and ensure that before build_ext runs you either set
os.environ['_CURRENCY'] to that parsed value or, better, generate the
Extension.define_macros/KTH_CURRENCY_* flags from the parsed option (e.g., in
your custom build_ext subclass or in the command finalization where
get_ext_kwargs/Extension objects are created) so KTH_CURRENCY_* macros reflect
the --currency value rather than the original environment value.
---
Nitpick comments:
In `@examples/tx_serialization.py`:
- Line 13: Replace the extra allocation when decoding hex by using the direct
classmethod: change the assignment that creates tx_bytes from
bytes(bytearray.fromhex(tx_hex_str)) to use bytes.fromhex(tx_hex_str) instead,
updating the expression that references tx_hex_str (the tx_bytes variable
initialization in examples/tx_serialization.py) to avoid the unnecessary
bytearray temporary.
- Around line 24-27: Remove the stale commented C-API mapping lines that are
distracting from the example: delete the three commented entries referencing
"input_list_push_back", "input_list_count", and "input_list_nth" (and their
corresponding kth_py_native_* names) so only relevant example code remains; look
for those exact symbols in the file (input_list_push_back,
kth_py_native_input_list_push_back, input_list_count,
kth_py_native_input_list_count, input_list_nth, kth_py_native_input_list_nth)
and remove the commented lines.
In `@tests/test_config.py`:
- Around line 125-128: The test pins the first seed entry which is brittle;
instead, change the assertions on setts.network.seeds to verify that an entry
with scheme=='' , host=='seed.flowee.cash' and port==8333 exists anywhere in the
list (e.g. use any(...) over setts.network.seeds or a set membership check)
rather than asserting setts.network.seeds[0]; keep the existing len(...)
assertion but replace the three seeds[0] assertions with a single existence
check referencing setts and setts.network.seeds.
🪄 Autofix (Beta)
Fix all unresolved CodeRabbit comments on this PR:
- Push a commit to this branch (recommended)
- Create a new PR with the fixes
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: 3ed51a7d-3d6d-4963-b2ae-0475ddd9f055
📒 Files selected for processing (44)
- .appveyor.yml
- .github/workflows/ci.yml
- .github/workflows/knuth.yml
- .gitignore
- .travis/entrypoint copy 2.sh
- .travis/entrypoint.sh
- .travis/entrypoint_debian.sh
- AUTHORS
- README.md
- __init__.py
- build_and_upload_pkg.bat
- conanfile.py
- examples/config.py
- examples/initchain.py
- examples/settings.py
- examples/tx_serialization.py
- examples/wallet.py
- include/kth/py-native/chain/payment_address.h
- include/kth/py-native/chain/script.h
- include/kth/py-native/chain/transaction.h
- include/kth/py-native/config/blockchain_settings.h
- include/kth/py-native/config/database_settings.h
- kth_native.pyi
- py.typed
- pyproject.toml
- scripts/build.sh
- scripts/build_and_upload_pkg.sh
- scripts/build_macOs.sh
- scripts/copy_kth_from_conan_cache.sh
- scripts/reset.sh
- settings.py
- setup.py
- src/chain/block.cpp
- src/chain/header.cpp
- src/chain/output_point.cpp
- src/chain/payment_address.cpp
- src/chain/script.cpp
- src/chain/transaction.cpp
- src/config/database_settings.cpp
- src/module.c
- src/node.cpp
- tests/test_config.py
- tests/test_node.py
- version.py
💤 Files with no reviewable changes (11)
- settings.py
- __init__.py
- version.py
- AUTHORS
- build_and_upload_pkg.bat
- include/kth/py-native/chain/transaction.h
- .github/workflows/knuth.yml
- .travis/entrypoint.sh
- .travis/entrypoint_debian.sh
- .travis/entrypoint copy 2.sh
- .appveyor.yml
✅ Files skipped from review due to trivial changes (8)
- examples/config.py
- tests/test_node.py
- src/node.cpp
- src/config/database_settings.cpp
- .gitignore
- examples/settings.py
- include/kth/py-native/chain/payment_address.h
- pyproject.toml
🚧 Files skipped from review as they are similar to previous changes (9)
- include/kth/py-native/chain/script.h
- examples/wallet.py
- include/kth/py-native/config/database_settings.h
- src/chain/block.cpp
- examples/initchain.py
- src/chain/payment_address.cpp
- .github/workflows/ci.yml
- include/kth/py-native/config/blockchain_settings.h
- kth_native.pyi
Force-pushed from 13ceaf8 to ca6242c
Actionable comments posted: 1
♻️ Duplicate comments (4)
setup.py (2)
137-139: ⚠️ Potential issue | 🟠 Major
`build_ext --microarch`/`--currency` can reuse stale staged libs.
Line 137 only checks whether any libs exist, not whether they match current options. This can silently link the wrong flavor.
🛠️ Minimal fix

```diff
-        if not discover_static_libs():
+        if self.microarch or self.currency or not discover_static_libs():
             run_conan_install(self.microarch, self.currency)
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@setup.py` around lines 137 - 139, discover_static_libs() is only checked for existence, not for matching self.microarch/self.currency, causing stale libs to be reused; change the sequence in setup.py to call discover_static_libs() into a local variable (libs), validate that those libs correspond to the current self.microarch and self.currency (e.g., by inspecting lib filenames/metadata returned by discover_static_libs()), and if they don't match, call run_conan_install(self.microarch, self.currency) and then re-run discover_static_libs() to refresh libs before proceeding.
24-41: ⚠️ Potential issue | 🟠 Major
Conan currency and compile-time `KTH_CURRENCY_*` macro can diverge.
`run_conan_install()` honors CLI `--currency`, but `define_macros` uses `_CURRENCY` from env at import time. A CLI override can build BTC/LTC libs while compiling BCH wrappers.
🛠️ Keep macro and Conan option in lockstep

```diff
 class BuildExtCommand(build_ext):
@@
     def run(self):
+        resolved_currency = (self.currency or os.environ.get('KTH_CURRENCY') or _CURRENCY).upper()
+        for ext in self.extensions:
+            ext.define_macros = [
+                (k, v) for (k, v) in ext.define_macros
+                if not k.startswith("KTH_CURRENCY_")
+            ]
+            ext.define_macros.append((f"KTH_CURRENCY_{resolved_currency}", None))
-        if not discover_static_libs():
-            run_conan_install(self.microarch, self.currency)
+        if self.microarch or self.currency or not discover_static_libs():
+            run_conan_install(self.microarch, resolved_currency)
```

Also applies to: 160-179
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@setup.py` around lines 24 - 41, run_conan_install() builds with a currency selected from the function argument or env var, but the C compile-time macro (the define_macros path that reads KTH_CURRENCY at import time) can still pick a different value; make the currency decision a single source of truth and propagate it to macro generation: compute the effective currency once (using the same precedence logic as run_conan_install: function arg overrides env) and either set os.environ['KTH_CURRENCY'] to that value before any define_macros are created or modify the macro-generation routine to accept a currency parameter and use that parameter when producing the KTH_CURRENCY_* define; update references in run_conan_install and the macro/extension setup so both use the same effective currency variable.src/module.c (2)
643-643: ⚠️ Potential issue | 🟠 Major
Guard and DECREF nested `*_to_py()` results before assigning attributes.
Patterns like `PyObject_SetAttrString(obj, "node", kth_py_native_config_node_settings_to_py(...))` leak references on success and are unsafe if the nested converter returns `NULL`.
🛠️ Example fix (Line 927 pattern)

```diff
-    KTH_TRY_SETATTR(obj, PyObject_SetAttrString(obj, "node", kth_py_native_config_node_settings_to_py(&setts->node)));
+    PyObject* node_obj = kth_py_native_config_node_settings_to_py(&setts->node);
+    if (node_obj == NULL) { Py_DECREF(obj); return NULL; }
+    KTH_TRY_SETATTR(obj, PyObject_SetAttrString(obj, "node", node_obj));
+    Py_DECREF(node_obj);
```

Also applies to: 855-859, 866-866, 869-869, 927-930
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/module.c` at line 643, The call is creating a temporary PyObject* from config_checkpoints_to_py and passing it directly to PyObject_SetAttrString via KTH_TRY_SETATTR which leaks on success and fails to handle NULL; change to store the converter result in a local PyObject* (e.g., tmp = config_checkpoints_to_py(...)), check for NULL and handle the error path, call PyObject_SetAttrString(obj, "checkpoints", tmp), and then Py_DECREF(tmp) after a successful set; apply the same pattern to other nested converters such as kth_py_native_config_node_settings_to_py and any other *_to_py() uses referenced in the comment (lines noted: the node settings and similar converters at the other occurrences) and use KTH_TRY or the existing error handling macros consistently when checking for NULL.
57-60: ⚠️ Potential issue | 🟠 Major
`KTH_TRY_SETATTR` improves error flow, but `KTH_PY_SETATTR` still leaks Python objects.
Every `KTH_PY_SETATTR` call builds a temporary Python object and hands it to `PyObject_SetAttrString`; that API does not steal references, so these temporaries leak on success.
🛠️ Proposed fix pattern

```diff
+static int kth_setattr_owned(PyObject* obj, const char* name, PyObject* value) {
+    if (value == NULL) {
+        return -1; // Python error already set by creator
+    }
+    int rc = PyObject_SetAttrString(obj, name, value);
+    Py_DECREF(value);
+    return rc;
+}
-#define KTH_PY_SETATTR(to, from, var, fmt) PyObject_SetAttrString(to, #var, Py_BuildValue(fmt, from->var))
+#define KTH_PY_SETATTR(to, from, var, fmt) kth_setattr_owned((to), #var, Py_BuildValue((fmt), (from)->var))
```

```bash
#!/bin/bash
rg -n "KTH_PY_SETATTR\\(|PyObject_SetAttrString\\(" src/module.c
```

Also applies to: 376-381, 438-443, 636-672, 834-870
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/module.c` around lines 57 - 60, KTH_PY_SETATTR currently creates a temporary PyObject and calls PyObject_SetAttrString which does not steal the reference, so the temporary leaks on success; change the macro (or call sites using KTH_PY_SETATTR) to always Py_DECREF the temporary after calling PyObject_SetAttrString: create the temp, call PyObject_SetAttrString, if the call returns <0 then Py_DECREF(temp) and return NULL (matching KTH_TRY_SETATTR flow), otherwise Py_DECREF(temp) and continue; update occurrences that use KTH_PY_SETATTR (referenced by the macro name and calls to PyObject_SetAttrString) so temporaries are DECREF'd on both success and failure.
🧹 Nitpick comments (1)
examples/tx_serialization.py (1)
13-14: Simplify hex decoding by using `bytes.fromhex()` directly.
The intermediate `bytearray` conversion is unnecessary since Python 3 provides `bytes.fromhex()`.
♻️ Suggested simplification

```diff
-    tx_bytes = bytes(bytearray.fromhex(tx_hex_str))
+    tx_bytes = bytes.fromhex(tx_hex_str)
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@examples/tx_serialization.py` around lines 13 - 14, The code unnecessarily creates a bytearray when decoding hex; replace the intermediate conversion so tx_bytes is produced using bytes.fromhex(tx_hex_str) and then passed to bn.transaction_factory_from_data(tx_bytes) (i.e., remove the bytearray.fromhex(...) step and construct tx_bytes directly with bytes.fromhex to simplify and clarify the tx_bytes creation used by tx and bn.transaction_factory_from_data).
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@tests/test_config.py`:
- Line 8: Rename the parameter in function hash_to_hex from shadowing built-in
"hash" to "hash_bytes" and update all internal references to use "hash_bytes";
then replace all explicit equality comparisons to True/False in this file (e.g.,
occurrences around the test assertions mentioned) with idiomatic truthiness
checks (e.g., "if x:" or "if not x:") so Ruff A002 and E712 violations are
resolved, ensuring any conditional logic that previously used "== True" or "==
False" now uses plain truthiness or negation.
---
Duplicate comments:
In `@setup.py`:
- Around line 137-139: discover_static_libs() is only checked for existence, not
for matching self.microarch/self.currency, causing stale libs to be reused;
change the sequence in setup.py to call discover_static_libs() into a local
variable (libs), validate that those libs correspond to the current
self.microarch and self.currency (e.g., by inspecting lib filenames/metadata
returned by discover_static_libs()), and if they don't match, call
run_conan_install(self.microarch, self.currency) and then re-run
discover_static_libs() to refresh libs before proceeding.
- Around line 24-41: run_conan_install() builds with a currency selected from
the function argument or env var, but the C compile-time macro (the
define_macros path that reads KTH_CURRENCY at import time) can still pick a
different value; make the currency decision a single source of truth and
propagate it to macro generation: compute the effective currency once (using the
same precedence logic as run_conan_install: function arg overrides env) and
either set os.environ['KTH_CURRENCY'] to that value before any define_macros are
created or modify the macro-generation routine to accept a currency parameter
and use that parameter when producing the KTH_CURRENCY_* define; update
references in run_conan_install and the macro/extension setup so both use the
same effective currency variable.
In `@src/module.c`:
- Line 643: The call is creating a temporary PyObject* from
config_checkpoints_to_py and passing it directly to PyObject_SetAttrString via
KTH_TRY_SETATTR which leaks on success and fails to handle NULL; change to store
the converter result in a local PyObject* (e.g., tmp =
config_checkpoints_to_py(...)), check for NULL and handle the error path, call
PyObject_SetAttrString(obj, "checkpoints", tmp), and then Py_DECREF(tmp) after a
successful set; apply the same pattern to other nested converters such as
kth_py_native_config_node_settings_to_py and any other *_to_py() uses referenced
in the comment (lines noted: the node settings and similar converters at the
other occurrences) and use KTH_TRY or the existing error handling macros
consistently when checking for NULL.
- Around line 57-60: KTH_PY_SETATTR currently creates a temporary PyObject and
calls PyObject_SetAttrString which does not steal the reference, so the
temporary leaks on success; change the macro (or call sites using
KTH_PY_SETATTR) to always Py_DECREF the temporary after calling
PyObject_SetAttrString: create the temp, call PyObject_SetAttrString, if the
call returns <0 then Py_DECREF(temp) and return NULL (matching KTH_TRY_SETATTR
flow), otherwise Py_DECREF(temp) and continue; update occurrences that use
KTH_PY_SETATTR (referenced by the macro name and calls to
PyObject_SetAttrString) so temporaries are DECREF'd on both success and failure.
---
Nitpick comments:
In `@examples/tx_serialization.py`:
- Around line 13-14: The code unnecessarily creates a bytearray when decoding
hex; replace the intermediate conversion so tx_bytes is produced using
bytes.fromhex(tx_hex_str) and then passed to
bn.transaction_factory_from_data(tx_bytes) (i.e., remove the
bytearray.fromhex(...) step and construct tx_bytes directly with bytes.fromhex
to simplify and clarify the tx_bytes creation used by tx and
bn.transaction_factory_from_data).
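As a quick check, the two forms produce identical bytes (toy hex string here, not the example's real transaction):

```python
tx_hex_str = "0100"  # toy stand-in for the example's transaction hex
old_tx_bytes = bytes(bytearray.fromhex(tx_hex_str))  # current roundabout form
tx_bytes = bytes.fromhex(tx_hex_str)                 # suggested direct form
assert tx_bytes == old_tx_bytes == b"\x01\x00"
```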
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: 057059a2-7800-494e-a6d7-3fe58e7c46c7
📒 Files selected for processing (44)
.appveyor.yml, .github/workflows/ci.yml, .github/workflows/knuth.yml, .gitignore, .travis/entrypoint copy 2.sh, .travis/entrypoint.sh, .travis/entrypoint_debian.sh, AUTHORS, README.md, __init__.py, build_and_upload_pkg.bat, conanfile.py, examples/config.py, examples/initchain.py, examples/settings.py, examples/tx_serialization.py, examples/wallet.py, include/kth/py-native/chain/payment_address.h, include/kth/py-native/chain/script.h, include/kth/py-native/chain/transaction.h, include/kth/py-native/config/blockchain_settings.h, include/kth/py-native/config/database_settings.h, kth_native.pyi, py.typed, pyproject.toml, scripts/build.sh, scripts/build_and_upload_pkg.sh, scripts/build_macOs.sh, scripts/copy_kth_from_conan_cache.sh, scripts/reset.sh, settings.py, setup.py, src/chain/block.cpp, src/chain/header.cpp, src/chain/output_point.cpp, src/chain/payment_address.cpp, src/chain/script.cpp, src/chain/transaction.cpp, src/config/database_settings.cpp, src/module.c, src/node.cpp, tests/test_config.py, tests/test_node.py, version.py
💤 Files with no reviewable changes (11)
- AUTHORS
- __init__.py
- version.py
- settings.py
- build_and_upload_pkg.bat
- include/kth/py-native/chain/transaction.h
- .travis/entrypoint_debian.sh
- .appveyor.yml
- .travis/entrypoint copy 2.sh
- .travis/entrypoint.sh
- .github/workflows/knuth.yml
✅ Files skipped from review due to trivial changes (7)
- src/config/database_settings.cpp
- tests/test_node.py
- examples/config.py
- src/node.cpp
- .gitignore
- include/kth/py-native/config/blockchain_settings.h
- examples/initchain.py
🚧 Files skipped from review as they are similar to previous changes (8)
- src/chain/output_point.cpp
- examples/wallet.py
- examples/settings.py
- src/chain/header.cpp
- include/kth/py-native/config/database_settings.h
- src/chain/payment_address.cpp
- pyproject.toml
- kth_native.pyi
The standalone `c-api/<ver>@kth/stable` recipe was rolled into a single unified `kth` package; this branch updates py-native to consume it, modernizes the build pipeline, brings the C bindings up to date with the C-API drift accumulated over the last ~4 years, and replaces the deprecated CI workflow with a cibuildwheel-based one.

Build pipeline:
- conanfile.py: Conan 2 syntax (requirements()/generate()), pins kth/0.78.0 with currency=BCH and db=dynamic, stages headers and static libs from kth and ALL transitive deps into ./kth/.
- setup.py: drop dead Conan 1.x Conan.factory() path; shell out to `conan install` instead. BuildExtCommand discovers staged libs at build time and wraps them in -Wl,--start-group/--end-group so the linker resolves circular deps regardless of glob order. C++23, explicit -lstdc++. run_conan_install() also reads KTH_MARCH_ID and KTH_CURRENCY from env (CLI flags still win) so CI can override.
- Move .sh helpers to scripts/, drop .bat, .travis/, .appveyor.yml, AUTHORS, residual __init__.py.

C-API drift fixes (Python signatures changed accordingly — py-api will need matching updates):
- block/header/transaction factory_from_data: drop `version` arg
- block/header serialized_size, header_to_data: drop `version` arg
- output_point_construct_from_hash_index: takes kth_hash_t const* now
- chain_script_construct -> chain_script_construct_from_bytes
- wallet_payment_address_encoded split into _legacy / _cashaddr (with token_aware bool) / _token (the BCH-only ones gated by KTH_CURRENCY_BCH)
- transaction_hash_sighash_type removed (no longer in C-API)
- node_get_chain -> node_chain
- node.cpp: explicitly include kth/capi/node_info.h (no longer pulled in transitively by kth/capi.h)

Settings struct sync:
- DatabaseSettings: drop flush_writes/file_growth_rate/index_start_height; add db_mode (kth_db_mode_t enum: 0=pruned, 1=normal, 2=full_indexed).
- BlockchainSettings: drop gauss_activation_time/descartes_activation_time; add bch_gauss/bch_descartes/bch_lobachevski/bch_galois bools and the new leibniz_activation_time (2026) / cantor_activation_time (2027).

module.c quality pass:
- Replace 8 functions full of `int res2 = ...; res2 = ...; res2 = ...` (set-but-never-checked) with a KTH_TRY_SETATTR macro that Py_DECREFs the half-built object and returns NULL on the first failure, propagating the Python exception correctly. The macro name is provisional; flagged for the cleanup pass.

Tests:
- test_config.py rewritten with shared `_assert_common_*` helpers (mainnet and testnet4 share ~80 asserts), named constants (DB_MODE_NORMAL, LEIBNIZ_ACTIVATION_TIME, CANTOR_ACTIVATION_TIME), loops over BIP/BCH flag families instead of 30 copy-pasted lines, and `>=` bounds instead of exact equality on things that grow organically (checkpoint count, seed count). Covers the new db_mode and bch_* fields, drops the removed gauss/descartes ones.
- test_node.py: rename node_get_chain -> node_chain.
- All 3 tests pass against a real Knuth v0.78.0 node startup on mainnet (32+32 threads, march_id=ZLm9Pjh).

CI (replaces the old knuth.yml):
- The old workflow was a Frankenstein of npm/node-pre-gyp/twine/Docker-custom across three shells, pinned to ubuntu-20.04 (deprecated by GitHub) and node-pre-gyp paths that don't even apply to a Python package. Replaced wholesale.
- New ci.yml: sdist job + cibuildwheel matrix on Linux x86_64 (manylinux container) and macOS arm64 (macos-latest is Apple Silicon now). cp39-cp313, no PyPy, no musllinux, no 32-bit.
- Conan installed in CIBW_BEFORE_ALL, kth remote registered, and CIBW_BEFORE_BUILD pre-installs the build deps so we can run pip with --no-build-isolation (our setup.py shells out to `conan install`).
- Smoke test inside each built wheel: `import kth_native` + pytest.
- publish job scaffolded for Trusted Publishing on tag pushes, gated behind `if: false` until PyPI auth is wired up.
- KTH_MARCH_ID/KTH_CURRENCY pinned at workflow level so the build flavor lives in exactly one place.
- Concurrency group cancels superseded PR runs to save minutes.
- Windows is intentionally out of this pass — the old workflow's Windows path was 100% npm/node-gyp/node-pre-gyp and never built a Python wheel in the first place. Add `windows-latest` later as a follow-up once Linux/macOS is green.
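The staged-library link step described above can be sketched roughly like this (the helper name and directory layout are illustrative, not the actual `BuildExtCommand` code; only the `-Wl,--start-group`/`-Wl,--end-group` wrapping and the explicit `-lstdc++` come from the description):

```python
from pathlib import Path

def linker_group_args(staged_lib_dir="kth/lib"):
    """Wrap every staged static lib in --start-group/--end-group so the
    linker re-scans the archives and circular deps resolve regardless of
    glob order; append -lstdc++ explicitly."""
    libs = sorted(str(p) for p in Path(staged_lib_dir).glob("*.a"))
    return ["-Wl,--start-group", *libs, "-Wl,--end-group", "-lstdc++"]
```

These flags would feed `extra_link_args` of the extension; without the group, archive order in the glob could leave circular symbols unresolved.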
Force-pushed from ca6242c to 333ace3.
Summary
- Migrate from the `c-api/<ver>@kth/stable` Conan 1 recipe to the unified `kth/0.78.0` Conan 2 package.
- Modernize the build pipeline (`--start-group` link, dynamic lib discovery, no more `Conan.factory()`).
- Sync the `DatabaseSettings` and `BlockchainSettings` Python structs with the current C `kth_*_settings` layouts.
- Fix the `*_to_py` functions in `module.c` to actually propagate Python errors (was set-but-never-checked).
- Rewrite `tests/test_config.py` with shared helpers, named constants, and tolerant bounds.
Build pipeline
- `conanfile.py` → Conan 2 (`requirements()`/`generate()`), pinned `kth/0.78.0`, `currency=BCH`, `db=dynamic`. Stages headers and static libs from `kth` and all transitive deps into `./kth/`.
- `setup.py` → drops the dead Conan 1 path, shells out to `conan install`. `BuildExtCommand` discovers staged libs at build time and wraps them in `-Wl,--start-group`/`-Wl,--end-group` so the linker resolves circular deps. C++23, explicit `-lstdc++`.
- `.sh` helpers moved to `scripts/`; dropped `.bat`, `.travis/`, `.appveyor.yml`, `AUTHORS`, residual root `__init__.py`.
C-API drift fixes
py-api will need matching updates when we get to it:
- `block_factory_from_data(version, data)` → `(data)`
- `block_serialized_size(block, version)` → `(block)`
- `header_to_data(header, version)` → `(header)`
- `header_factory_from_data(version, data)` → `(data)`
- `transaction_factory_from_data(version, data)` → `(data)`
- `transaction_hash_sighash_type` removed (no longer in the C-API)
- `chain_script_construct` → `chain_script_construct_from_bytes`
- `wallet_payment_address_encoded` split into `_legacy`, `_cashaddr` (`token_aware`), `_token` (BCH-only ones gated)
- `node_get_chain` → `node_chain`
- `output_point_construct_from_hash_index` takes `kth_hash_t const*` instead of `kth_hash_t` by value
- `node.cpp` now explicitly includes `kth/capi/node_info.h` (no longer pulled in transitively by `kth/capi.h`).
Settings struct sync
- `DatabaseSettings`: drop `flush_writes`/`file_growth_rate`/`index_start_height`; add `db_mode` (`kth_db_mode_t` enum: `0=pruned`, `1=normal`, `2=full_indexed`).
- `BlockchainSettings`: drop `gauss_activation_time`/`descartes_activation_time`; add `bch_gauss`/`bch_descartes`/`bch_lobachevski`/`bch_galois` bools and the new `leibniz_activation_time` (2026) / `cantor_activation_time` (2027).
`module.c` quality pass
The 8 `*_to_py` builders used to assign each `PyObject_SetAttrString` result to a local `int res2` that was never checked. This silently swallowed every `SetAttr` failure and returned a half-built object. Replaced with `KTH_TRY_SETATTR(obj, expr)`, which on failure `Py_DECREF`s `obj` and returns `NULL`, properly propagating the Python exception. The macro name is provisional — flagged for the cleanup pass.
Tests
- `test_config.py` rewritten with `_assert_common_chain/database/network/node_defaults` helpers (mainnet and testnet4 share ~80 asserts), named constants (`DB_MODE_NORMAL`, `LEIBNIZ_ACTIVATION_TIME`, `CANTOR_ACTIVATION_TIME`), loops over BIP/BCH flag families instead of 30 copy-pasted lines, and `>=` bounds instead of exact equality on things that grow organically (checkpoint count, seed count). Covers the new `db_mode` and `bch_*` fields, drops the removed `gauss`/`descartes` ones.
- `test_node.py`: rename `node_get_chain` → `node_chain`.
Test plan
- `pip install --no-build-isolation -e .` (succeeds)
- `import kth_native` (206 symbols exposed)
- `pytest tests/` → 3 passed
- `test_node.py` actually launches a Knuth v0.78.0 node on mainnet (32+32 threads, `march_id=ZLm9Pjh`), reads chain height, signals stop, destructs cleanly
- (`cibuildwheel`)