Tallinn, Estonia
support@lnsolutions.ee

In the first part of this series (or here for non-subscribers), we built a Bitcoin trading bot that listens to trading signals (for example from a Discord channel) and executes trades on LN Markets. The core pieces were a Discord signal listener, the LN Markets REST client, and a local SQLite database (trades.db) of executed trades.
In this second part, we’ll evolve that architecture into something more intelligent: a bot that keeps your existing trading strategy logic, but delegates parts of your strategy to a Python-based machine learning (ML) agent.
This is a pattern you can reuse across strategies and platforms.
In a typical first version of a trading bot, some parameters, such as stop-loss distance, take-profit distance, leverage, or position size, are hard-coded or stored in a config file.
This is easy to implement but has a clear limitation: the parameters never adapt to what your own trade history tells you.
The architecture we'll discuss keeps your discretionary or rule-based strategy intact, but adds a feedback loop: realized trade outcomes are fed back into the parameters the strategy uses.
The result is a bot that is still your strategy, but with a data-informed parameter layer wrapping it.
Let’s zoom out. The evolved system has four main components:
The Node.js trading bot:
- Parses incoming signals.
- Decides whether a new trade should be opened or an existing one closed.
- Calls LN Markets via @ln-markets/api.
- Persists trade metadata into trades.db.

The trade database (trades.db):
- Stores each trade along with metadata such as indicator, timeframe, trade type, and price information.
- Accumulates realized statistics about how trades behave once they are opened.

The Python parameter ML agent:
- Runs periodically (e.g., hourly) in its own virtual environment.
- Loads historical trades from trades.db.
- Engineers features and trains an ML model.
- Writes out a JSON file (e.g., params.json) containing per-regime parameters.

The parameter overlay:
- The Node bot reloads this JSON and uses it as an overlay on top of your existing config.
Conceptually, the pipeline looks like this:
Signals → Node Bot → LN Markets & trades.db
trades.db → Python ML Agent → params.json → Node Bot (param overlay)
Before bringing ML into the picture, your bot needs to produce the raw material the agent will learn from.
On the Node.js side you already have:
- Signal parsing and trade execution (futuresNewTrade).
- Persistence into trades.db (ID, timestamps, prices, indicator, timeframe, trade type).

To support ML-driven parameter management, add or ensure:
- The indicator and timeframe stored with each trade (e.g., 15m, 1h, 4h).
- A price tracker that:
  - Fetches open trades from LN Markets.
  - Tracks the min and max price each trade sees while it is open.
  - Persists these "extreme" prices back into the database as simple summary statistics.
This is enough for a Python agent to reconstruct how far each trade moved in your favor and against you while it was open.
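For instance, a minimal sketch of that reconstruction (the side convention and the price arguments are assumptions, not the bot's actual schema):

```python
def excursions(entry_price, min_price, max_price, side):
    """Return (max_favorable, max_adverse) excursion as fractions of entry.

    side: 'long' or 'short'; min_price/max_price are the extreme prices
    observed while the trade was open.
    """
    up = (max_price - entry_price) / entry_price
    down = (entry_price - min_price) / entry_price
    if side == 'long':
        return up, down   # favorable = up-move, adverse = down-move
    return down, up       # for shorts, the roles are reversed

# Example: a long entered at 50,000 that saw 48,000..53,000 while open
mfe, mae = excursions(50_000, 48_000, 53_000, 'long')  # → (0.06, 0.04)
```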
The Python agent's responsibilities are:

1. Load trade history from trades.db:
from pathlib import Path

DB_PATH = 'trades.db'                # relative to this script
OUTPUT_PATH = 'analysis/params.json'

def main():
    """Main execution function."""
    print("=" * 60)
    print("AI Trading Bot Parameter Generator")
    print("=" * 60)

    # Resolve paths
    db_path = Path(__file__).parent / DB_PATH
    output_path = Path(__file__).parent / OUTPUT_PATH
    if not db_path.exists():
        print(f"ERROR: Database not found at {db_path}")
        return 1

    # Ensure output directory exists
    output_path.parent.mkdir(parents=True, exist_ok=True)

    # Load data
    df = load_trade_data(str(db_path))
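The helper load_trade_data is referenced but not shown; a plausible minimal version, assuming a trades table and a tradeType column matching your own schema, might look like:

```python
import sqlite3

import pandas as pd

def load_trade_data(db_path):
    """Load recorded trades from the SQLite database into a DataFrame."""
    conn = sqlite3.connect(db_path)
    try:
        # 'trades' and the tradeType filter are assumed names; adapt to your schema.
        df = pd.read_sql_query(
            "SELECT * FROM trades WHERE tradeType IN ('Buy', 'Sell')", conn)
    finally:
        conn.close()
    print(f"Loaded {len(df)} trades from {db_path}")
    return df
```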
2. Engineer features: Typical features include:
- Indicator.
- Timeframe.
- Trade type (long vs short).
- Regime flags (e.g., a boolean for trending vs ranging).
- Distance to a moving average at entry.
- Whether the entry was near a recent local high/low.
- Simple time-of-day or day-of-week information.
import numpy as np

def engineer_features(df):
    """Engineer features for the ML model."""
    # Create derived features, guarding against missing values and
    # zero entry prices to avoid division errors
    df['ma_distance'] = np.where(
        df['movingAverage'].notna() & (df['entryprice'] != 0),
        (df['entryprice'] - df['movingAverage']) / df['entryprice'],
        0
    )
    df['local_bottom_distance'] = np.where(
        df['localBottom'].notna() & (df['entryprice'] != 0),
        (df['entryprice'] - df['localBottom']) / df['entryprice'],
        0
    )
    df['local_top_distance'] = np.where(
        df['localTop'].notna() & (df['entryprice'] != 0),
        (df['localTop'] - df['entryprice']) / df['entryprice'],
        0
    )
    print(f"Engineered features for {len(df)} trades")
    return df
3. Train an ML model: The agent can use a tree-based regressor such as XGBoost (or any library in your stack) to learn a mapping from (indicator, timeframe, regime, trade type, features) to a target parameter, here the stop-loss percentage sl_pct:

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error, mean_squared_error
from xgboost import XGBRegressor

def train_model(df):
    """Train an XGBoost regressor on the engineered features."""
    print("Training XGBoost model...")

    # Define target and features
    target = 'sl_pct'
    exclude_cols = [
        target,
        'entryprice',
        'stoploss',
        'takeprofit',
        'profit',
        'movingAverage',
        'localBottom',
        'localTop',
        'features',
    ]
    feature_cols = [col for col in df.columns if col not in exclude_cols]
    X = df[feature_cols]
    y = df[target]

    # Split data
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=42
    )

    # Train model
    model = XGBRegressor(
        n_estimators=100,
        max_depth=5,
        learning_rate=0.1,
        random_state=42,
        objective='reg:squarederror'
    )
    model.fit(X_train, y_train)

    # Evaluate
    y_pred = model.predict(X_test)
    mae = mean_absolute_error(y_test, y_pred)
    rmse = np.sqrt(mean_squared_error(y_test, y_pred))
    print("Model trained successfully!")
    print(f"  MAE: {mae:.4f}%")
    print(f"  RMSE: {rmse:.4f}%")
    print("  Feature importance (top 5):")
    feature_importance = sorted(
        zip(feature_cols, model.feature_importances_),
        key=lambda x: x[1],
        reverse=True
    )[:5]
    for feat, importance in feature_importance:
        print(f"    {feat}: {importance:.4f}")
    return model, feature_cols
4. Aggregate by regime and write JSON: Rather than storing one prediction per trade, the agent aggregates its predictions per regime key (indicator, timeframe, regime flag, trade type). The output is written to something like analysis/params.json, structured as a nested object keyed by those dimensions. Each leaf contains statistics and recommended parameters.
Again, this is your design: you decide what “parameter” means; the pattern here is only about how to export and consume it.
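As an illustration of step 4, the aggregation and export could be sketched like this (the key layout, the median statistic, and the field names are illustrative choices, not a fixed schema):

```python
import json
from collections import defaultdict
from statistics import median

def write_params(rows, output_path, min_samples=20):
    """Aggregate per-trade predictions into per-regime parameters.

    rows: iterable of dicts with 'indicator', 'timeframe', 'trade_type'
    and a predicted 'sl_pct'. Regimes with too few samples are skipped,
    so the bot falls back to its static config for them.
    """
    grouped = defaultdict(list)
    for r in rows:
        grouped[(r['indicator'], r['timeframe'], r['trade_type'])].append(r['sl_pct'])

    params = {}
    for (indicator, timeframe, trade_type), preds in grouped.items():
        if len(preds) < min_samples:
            continue  # not enough data: let the static config apply
        params.setdefault(indicator, {}).setdefault(timeframe, {})[trade_type] = {
            'sl_pct': round(median(preds), 4),
            'n_samples': len(preds),
        }

    with open(output_path, 'w') as f:
        json.dump(params, f, indent=2)
    return params
```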
On the Node.js side, you integrate the Python agent as a background job:

1. Spawn the Python process: run the agent inside its virtual environment (for example via child_process.spawn) and capture its stdout / stderr. This keeps the ML agent separate from your trading loop. The trading bot only cares that a JSON file appears when training succeeds.
2. Schedule periodic retraining: Using a scheduler like node-cron, you can re-run the Python agent on a fixed schedule (e.g., hourly) and, after each successful run, reload the params.json file into memory.
With params.json successfully loaded, the Node.js bot can expose a couple of helper functions that act as a thin translation layer between your static config and the ML-generated parameters.

The typical pattern: look up the regime key for the current trade (indicator, timeframe, trade type) and return the ML value if one exists. If no entry exists for that regime, fall back to the static value from your config rather than anything in params.json. This ensures backwards compatibility: you can always turn off the ML overlay or let it "fill in" only for regimes with enough data.
Whenever you plug ML into a live trading system, you need guardrails: require a minimum number of historical trades per regime, clamp ML-derived parameters to sane ranges, and keep a switch to disable the overlay entirely.
These patterns are generic and apply to any trading strategy, not just the one in this bot.
In Part 1 we focused on connecting a Node.js bot to LN Markets and executing trades from external signals. In this Part 2, we extended that architecture with a Python ML agent that learns from your historical trades and feeds the bot with regime-aware parameters via a simple JSON interface.
The critical design choices were:
- trades.db as the shared source of truth between the Node bot and the Python agent.
- A plain JSON file (params.json) as the interface between the two.
- A strict fallback to your static config whenever the ML layer has nothing to say.

From here, you can extend the same pattern to other parameters, other signal sources, or other exchanges. All of this leverages machine learning and Python AI agents to make your trading bot more adaptive and data-driven.
The XRechnung LNSolutions addon extends Alfresco with full support for German XRechnung electronic invoices. It automatically detects XRechnung XML documents, enriches them with metadata, and generates high-quality HTML and PDF renditions via a dedicated Transform Engine (T‑Engine).
The solution consists of three main components:
- An Alfresco platform module (repository JAR) providing XRechnung detection, the metadata aspect xr:xrechnungMetadata, and integration with the T-Engine.
- A license JAR (deployed to WEB-INF/lib or baked into a custom Docker image).
- The dedicated XRechnung Transform Engine (T-Engine).

To request a trial license, use: Request XRechnung trial license
Copy lnsolutions-alfresco-license-1.0.0.jar to:
<TOMCAT>/webapps/alfresco/WEB-INF/lib/ (non-Docker)

Copy xrechnung-lnsolutions-platform-1.0-SNAPSHOT.jar to:
<TOMCAT>/webapps/alfresco/WEB-INF/lib/ (non-Docker)

The platform module requires the license JAR (lnsolutions-alfresco-license). In alfresco-global.properties (or equivalent Docker environment), add the XRechnung T-Engine URL:
# URL of the XRechnung Transform Engine
localTransform.xrechnung.url=http://xrechnung-tengine:8091
Adjust the host/port to your environment (e.g. http://localhost:8091 for standalone testing).
Run the T-Engine via Docker:
docker run -d -p 8091:8091 --name xrechnung-tengine xrechnung-tengine:latest
curl http://localhost:8091/ready

Or run it standalone:
java -jar xrechnung-tengine-1.0-SNAPSHOT.jar
curl http://localhost:8091/ready
Configure the following environment variables for the T‑Engine (e.g. in a .env file or service configuration):
XRECHNUNG_TRANSFORM_CONFIG=/opt/xrechnung-tengine/config/xrechnung-transform-config.json
CORE_AIO_TRANSFORM_URL=http://transform-core-aio:8090/transform
LICENSE_VALIDATION_URL=http://<alfresco-host>:8080/alfresco/service/lnsolutions/admin/license-status
The T‑Engine calls LICENSE_VALIDATION_URL before each transform to enforce your license via Alfresco.
Detected documents carry the XRechnung metadata aspect (xr:xrechnungMetadata) and expose the properties Digital Invoice Profile, Digital Invoice Profile Description (Short), and Is Digital Invoice. The HTML rendition can be retrieved via:
http://<alfresco-host>:8080/alfresco/service/lnsolutions/node/xrechnung-html?id={nodeId}

The addon also supports a PDF pipeline: XRechnung XML → HTML (T‑Engine) → PDF (Alfresco pdfRenderer).

The XRechnung addon uses the XRechnung Metadata aspect (xr:xrechnungMetadata) as the technical indicator that a document is a valid XRechnung invoice.
The aspect defines the following properties:
- Is Digital Invoice (xr:isDigitalInvoice): Boolean flag that is set when a document is detected as XRechnung.
- Digital Invoice Profile (xr:digitalInvoiceProfile): Stores the detected digital invoice profile identifier.
- Digital Invoice Profile Description (Short) (xr:digitalInvoiceProfileDescShort): Short, human-readable label for the detected profile (e.g. displayed as "Digital Invoice Profile Description (Short)" in Alfresco).

The XRechnungDetectionBehavior listens to document creation and updates for cm:content nodes. When an uploaded XML file matches the XRechnung patterns (based on its content, metadata, or filename), the behavior automatically adds the xr:xrechnungMetadata aspect and sets xr:isDigitalInvoice = true.
Only documents that carry this XRechnung indicator (the xr:xrechnungMetadata aspect with xr:isDigitalInvoice=true) are treated as XRechnung by the HTML/PDF transformation pipeline. Generic XML files remain unaffected until they are positively identified as XRechnung.

If you want to evaluate the XRechnung addon before purchasing a full license, request a trial here:
Request XRechnung Trial License
Or contact us for a permanent license.
Git is a powerful version control system widely used by developers to manage source code repositories. One of the key features of Git is its support for branching, allowing developers to work on new features or experiments without affecting the main codebase. However, managing branches and resolving merge conflicts effectively are crucial skills for successful collaboration and code management. In this article, we’ll explore common Git branching tasks and how to handle merge conflicts gracefully for feature branches.
Git branches are independent lines of development within a Git repository. They allow developers to work on different features, bug fixes, or experiments without affecting the main codebase. Here are some essential concepts related to Git branches:
- Creating branches: create a new branch with the git checkout -b <branch-name> command. This creates a new branch and switches to it in one step.
- Switching branches: switch to an existing branch with the git checkout <branch-name> command. This allows you to work on different features or bug fixes seamlessly.
- Listing branches: list all branches with the git branch command. The branch you are currently on will be highlighted.
- Deleting branches: delete a branch with the git branch -d <branch-name> command. Be cautious when deleting branches, especially if they contain unmerged changes.

Merge conflicts occur when Git cannot automatically merge changes from different branches due to conflicting changes in the same part of a file. Here’s how to handle merge conflicts effectively:
- Update your branch: pull the latest changes from the remote with git pull origin <branch-name>.
- Merge: run the git merge <branch-name> command. Git will attempt to automatically merge the changes. If there are conflicts, Git will pause the merge process.
- Resolve conflicts: open each conflicting file, locate the conflict markers (<<<<<<<, =======, >>>>>>>) and keep the changes you want to retain.
- Finalize: stage the resolved files with git add . and commit the merge using git commit. Git will create a merge commit to finalize the merge process.
In this example we start with main, then create a new local branch and switch to it with git checkout -b <branch-name>.
Discord has become a cornerstone platform for communication among gamers, communities, and teams worldwide. The extensibility of Discord through bots has unlocked a plethora of possibilities, enabling automation, moderation, and enhanced user experiences. In this guide, we’ll delve into Discord bot development, using LN Markets as the trading platform.
Trading signals Discord channels and groups are online communities where traders come together to share valuable insights, trading signals, and analyses pertaining to the crypto market. These channels and groups are hosted on the Discord messaging platform, enabling seamless real-time communication and information exchange among members.
LN Markets is a platform that enables users to trade Bitcoin futures contracts using the Lightning Network, a second-layer solution for faster and cheaper Bitcoin transactions. LN Markets leverages the Lightning Network’s micro-payment capabilities to provide instant settlement and low transaction fees for trading activities.
Overall, LN Markets provides a unique trading experience for Bitcoin enthusiasts and traders looking to participate in the cryptocurrency markets using the Lightning Network’s innovative technology.
// Import necessary modules
const Discord = require('discord.js');
const client = new Discord.Client();

// Define bot behavior
client.on('message', message => {
  if (message.content === '!hello') {
    message.reply('Hello, world!');
  }
});

// Log in to Discord with your bot token
client.login('YOUR_DISCORD_BOT_TOKEN');
Here we will just focus on reading the last message from the trading channel:
// Register event for when client receives a message.
client.on('message', message => {
  console.log(message.content);
  processSignal(message);
});
We need to install the @ln-markets/api package for our project via this command:
npm install @ln-markets/api
The next code snippet will open a connection to the LN Markets API.
import { createRestClient } from '@ln-markets/api';

const key = config.lnmarketsKey.trim();
const secret = config.lnmarketsSecret.trim();
const passphrase = config.lnmarketsPassphrase.trim();
const network = 'testnet'; // use 'mainnet' for live trading
const lnclient = createRestClient({ network, key, secret, passphrase });
Now we parse the text from the Discord message and trade accordingly. In this example the signal didn’t come in plain text; it used some rich-text features from Discord (message.embeds), so these fields had to be parsed, not only the message.content field.
async function processSignal(message) {
  if (message.embeds.length > 0 && message.embeds[0].title.includes('Buy Signal!')) {
    // Extract trade information
    let entryPrice = null;
    let stopLoss = null;
    let takeProfit = null;
    const createdTimestamp = message.createdTimestamp;
    // The signal data lives in the embed fields
    const lines = message.embeds[0].fields;
    lines.forEach((line, index) => {
      if (line.name.includes('Entry Price')) {
        const priceLine = line.value.replace(/`/g, '');
        entryPrice = parseFloat(priceLine);
      } else if (line.name.includes('Stop Loss')) {
        const priceLine = line.value.replace(/`/g, '');
        stopLoss = Math.round(parseFloat(priceLine)) + 0.5;
      } else if (!line.value.includes('There will be a separate signal to take profit') && line.name.includes('Take Profit')) {
        const priceLine = line.value.replace(/`/g, '');
        takeProfit = Math.round(parseFloat(priceLine)) + 0.5;
      } else if (line.value.includes('There will be a separate signal to take profit that could be higher than this price.')) {
        const regex = /```(\d+(\.\d+)?)```/;
        // Extract the numeric value using the match method and regular expression
        const match = line.value.match(regex);
        if (match) {
          const priceLine = parseFloat(match[1]);
          takeProfit = Math.round(priceLine) + 0.5;
        }
      }
    });
    if (entryPrice !== null && stopLoss !== null) {
      console.log(`Entry Price: ${entryPrice}`);
      console.log(`Stop Loss: ${stopLoss}`);
      if (takeProfit !== null) {
        console.log(`Take Profit: ${takeProfit}`);
      }
      await getBuyTrade(createdTimestamp, takeProfit, stopLoss, entryPrice);
    }
  } else if (message.embeds.length > 0 && message.embeds[0].title.includes('Sell Signal!')) {
    let sellPrice = null;
    const createdTimestamp = message.createdTimestamp;
    const lines = message.embeds[0].fields;
    lines.forEach((line, index) => {
      if (line.value.includes('position has been closed at:')) {
        const regex = /```(\d+(\.\d+)?)```/;
        // Extract the numeric value using the match method and regular expression
        const match = line.value.match(regex);
        if (match) {
          sellPrice = parseFloat(match[1]);
        }
      }
    });
    if (sellPrice !== null) {
      console.log(`Sell Price: ${sellPrice}`);
      await getTradeByTimestamp(createdTimestamp, (err, rows) => {
        if (err) {
          console.error(err.message);
        } else if (rows.length > 0) {
          // This sell was already processed: nothing to do
          const tradeId = rows[0].id;
          console.log('Trade ID in Sell:', tradeId);
        } else if (lastBuyTradeId != null) { // This is a new sell trade
          // Close the open position on LN Markets
          lnclient.futuresCloseTrade(lastBuyTradeId).then((response) => {
            saveToDatabase(response.id, createdTimestamp, 0.0, 0.0, sellPrice, response.pl, 'Sell');
          })
          .catch((err) => {
            // Handle error
            console.error(err.message);
          });
        }
      });
    }
  }
}
async function getBuyTrade(createdTimestamp, takeProfit, stopLoss, entryPrice) {
  try {
    // Call getTradeByTimestamp and wait for it to complete
    const rows = await new Promise((resolve, reject) => {
      getTradeByTimestamp(createdTimestamp, (err, rows) => {
        if (err) {
          reject(err);
        } else {
          resolve(rows);
        }
      });
    });
    // Check if any rows were returned
    if (rows.length > 0) {
      const tradeId = rows[0].id;
      lastBuyTradeId = tradeId;
      console.log('Trade ID in Buy:', tradeId);
    } else {
      // Perform Trade in LNMarkets
      let data = {
        "type": 'm',
        "side": 'b',
        "leverage": LEVERAGE,
        "quantity": QUANTITY,
        "takeprofit": takeProfit,
        "stoploss": stopLoss
      };
      const trade = await lnclient.futuresNewTrade(data);
      saveToDatabase(trade.id, createdTimestamp, entryPrice, stopLoss, takeProfit, 0.0, 'Buy');
    }
  } catch (error) {
    console.error(error.message);
  }
}
In the previous code snippet, the function getBuyTrade uses getTradeByTimestamp, which queries a local SQLite database file to check whether this trade was already executed. The function saveToDatabase saves the trade to this database when the trade is new. This ensures that each trade is executed only once.
function saveToDatabase(lnMarketsId, createdTimestamp, entryPrice, stopLoss, takeProfit, profit, tradeType) {
  lastBuyTradeId = lnMarketsId;
  createTrade(lnMarketsId, createdTimestamp, entryPrice, stopLoss, takeProfit, profit, tradeType, err => {
    if (err) {
      console.error(err.message);
    } else {
      console.log(tradeType + " Trade created successfully!");
    }
  });
}
Finally we add some cleanup code for our database connection
// Clean up database connection when the application exits
process.on('exit', () => {
  closeConnection();
  console.log('Database connection closed');
});

// Handle Ctrl+C (SIGINT) signal
process.on('SIGINT', () => {
  closeConnection();
  console.log('Database connection closed');
  process.exit();
});
In this guide we saw an introduction to programming a Bitcoin trading bot on LN Markets, based on trading signals sent to a specific Discord channel. By mastering the concepts and techniques outlined in this guide, you’ll be well-equipped to build bots that take signals from specific trading servers and automate your trading.
For a German version, see here.

In the dynamic world of cryptocurrency, where volatility and innovation intersect, the regulatory landscape is continuously evolving. One notable aspect of this evolution is the taxation of cryptocurrency gains. While many countries have implemented various tax regimes for cryptocurrencies, some offer favorable policies that exempt long-term holders from certain tax liabilities. One such policy gaining traction in several jurisdictions is the tax exemption for crypto holdings held for one year or more.
The one-year holding policy, also known as long-term capital gains treatment, is a taxation framework that grants exemptions or reduced tax rates on profits generated from the sale or exchange of cryptocurrencies held for a minimum duration of one year. This policy aims to incentivize long-term investment in cryptocurrencies while encouraging stability in the market.
Several countries have embraced the concept of tax-free crypto gains for long-term holders; Germany is a prominent example, where private crypto gains are tax-exempt after a one-year holding period.
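At its core, the one-year test is a date comparison. A jurisdiction-agnostic sketch (real tax rules add edge cases; in some regimes, e.g., staking or lending can affect the holding period):

```python
from datetime import date

def is_long_term(acquired: date, sold: date) -> bool:
    """True if the asset was held for at least one full year before sale."""
    try:
        anniversary = acquired.replace(year=acquired.year + 1)
    except ValueError:
        # acquired on Feb 29: use Mar 1 of the next year as the anniversary
        anniversary = date(acquired.year + 1, 3, 1)
    return sold >= anniversary

# A coin bought 2022-05-10 and sold 2023-05-09 is still short-term;
# sold one day later, it is long-term.
```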
The ccGains (cryptocurrency gains) package provides a Python library for calculating capital gains made by trading cryptocurrencies or foreign currencies. It was created by Jürgen Probst and is hosted on GitHub (link here). Among its features are calculation of capital gains using the first-in/first-out method, import of trade histories from various exchanges’ CSV exports, and export of transaction reports as CSV, HTML, or PDF.
You’ll need Python (ccGains is tested under Python 2.7 and Python 3.x). Get it here: https://www.python.org/
ccGains can then easily be installed via the Python package manager pip:
git clone https://github.com/probstj/ccgains.git
Then, inside the cloned folder, run pip install . (note the . at the end), or pip install --user . for a user-local install, or pip install -e . for an editable development install.

Please have a look at examples/example.py and follow the comments to adapt it for your purposes. Copy it to a new file, for example taxReport2023.py.
Hourly data for a lot of exchanges is available for download at:
https://api.bitcoincharts.com/v1/csv/
To understand which file to download, consult this list:
https://bitcoincharts.com/markets/list/
For EUR prices on Kraken, download: https://api.bitcoincharts.com/v1/csv/krakenEUR.csv.gz and place it in the ../data folder.
For CHF prices, download: https://api.bitcoincharts.com/v1/csv/anxhkCHF.csv.gz
For SGD prices, download: https://api.bitcoincharts.com/v1/csv/anxhkSGD.csv.gz
The file consists of three comma-separated columns: the unix timestamp, the price, and the volume (amount traded).
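Before feeding such a file to ccgains, it can be sanity-checked with pandas (this helper is our own, not part of ccgains):

```python
import pandas as pd

def peek_price_csv(path):
    """Load a bitcoincharts-style CSV: unix timestamp, price, volume."""
    df = pd.read_csv(path, names=['timestamp', 'price', 'volume'])
    # Convert the unix timestamps to human-readable datetimes
    df['time'] = pd.to_datetime(df['timestamp'], unit='s')
    return df
```

pandas transparently decompresses .gz files, so the downloaded archives can be passed in directly.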
Create the HistoricData object by loading the mentioned file and specifying the price unit, i.e. fiat/btc:

h1 = ccgains.HistoricDataCSV(
    '../data/bitcoin_de_EUR_abridged_as_example.csv.gz', 'EUR/BTC')
For every coin you possessed at some point, its historical price in your native fiat currency must be known. This can be derived from its BTC price together with the BTC/fiat data given above (or even from its price in any other alt-coin whose price can itself be derived, in turn).
This data can be provided from any website that serves this data through an API, or from a csv-file, like above. Note that currently, only the API from Poloniex.com is implemented.
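The derivation via BTC is a simple cross-rate multiplication; for example (illustrative numbers):

```python
def cross_rate(alt_btc: float, btc_fiat: float) -> float:
    """Derive an altcoin's fiat price from its BTC price and BTC/fiat.

    alt_btc:  price of one unit of the altcoin in BTC (e.g. XMR/BTC)
    btc_fiat: price of one BTC in fiat (e.g. BTC/EUR)
    """
    return alt_btc * btc_fiat

# If XMR trades at 0.004 BTC and BTC at 25,000 EUR,
# then XMR is worth 0.004 * 25000 = 100 EUR.
```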
Create HistoricData objects to fetch rates from Poloniex.com (it is important to mention at least all traded coins here):
h2 = ccgains.HistoricDataAPI('data', 'btc/xmr')
h3 = ccgains.HistoricDataAPI('data', 'btc/eth')
h4 = ccgains.HistoricDataAPI('data', 'btc/usdt')
h5 = ccgains.HistoricDataAPI('data', 'btc/link')
h6 = ccgains.HistoricDataAPI('data', 'btc/bat')
h7 = ccgains.HistoricDataAPI('data', 'btc/zrx')
h8 = ccgains.HistoricDataAPI('data', 'btc/cvc')
h9 = ccgains.HistoricDataAPI('data', 'btc/dash')
h10 = ccgains.HistoricDataAPI('data', 'btc/knc')
h11 = ccgains.HistoricDataAPI('data', 'btc/mkr')
h12 = ccgains.HistoricDataAPI('data', 'btc/matic')
h13 = ccgains.HistoricDataAPI('data', 'btc/doge')
h14 = ccgains.HistoricDataAPI('data', 'btc/bch')
h15 = ccgains.HistoricDataAPI('data', 'btc/dot')
h16 = ccgains.HistoricDataAPI('data', 'btc/qtum')
h17 = ccgains.HistoricDataAPI('data', 'btc/ren')
h18 = ccgains.HistoricDataAPI('data', 'btc/str')
h19 = ccgains.HistoricDataAPI('data', 'btc/xtz')
h20 = ccgains.HistoricDataAPI('data', 'btc/trx')
h21 = ccgains.HistoricDataAPI('data', 'btc/zec')
h22 = ccgains.HistoricDataAPI('data', 'btc/ltc')
h23 = ccgains.HistoricDataAPI('data', 'btc/xrp')
h24 = ccgains.HistoricDataAPI('data', 'btc/omg')
h25 = ccgains.HistoricDataAPI('data', 'btc/etc')
h26 = ccgains.HistoricDataAPI('data', 'btc/dot')
h27 = ccgains.HistoricDataAPI('data', 'btc/dai')
h28 = ccgains.HistoricDataAPI('data', 'btc/usdc')
h29 = ccgains.HistoricDataAPICoinbase('data', 'cro/eur')
h30 = ccgains.HistoricDataAPIBinance('data', 'btc/uni')
h31 = ccgains.HistoricDataAPIBinance('data', 'avax/eur')
h32 = ccgains.HistoricDataAPIBinance('data', 'btc/dydx')
h33 = ccgains.HistoricDataAPIBinance('data', 'btc/iota')
h34 = ccgains.HistoricDataAPIBinance('data', 'btc/axs')
In h2 to h28 I used the class HistoricDataAPI which uses the public Poloniex API: https://poloniex.com/public?command=returnTradeHistory, since this is the exchange that has these pairs traded in the year 2023 in this example.
In h29 I used the class HistoricDataAPICoinbase which uses the public Coinbase API: ‘https://api.pro.coinbase.com/products/:SYMBOL:/candles’.
In h30 to h34 I used the class HistoricDataAPIBinance which will transparently fetch data on request (get_price) from the public Binance API: https://api.binance.com/api/v1/klines
For faster loading times on future calls, a HDF5 file is created from the requested data and used transparently the next time a request for the same day and pair is made. These HDF5 files are saved in cache_folder. The unit must be a string in the form ‘currency_one/currency_two’, e.g. ‘NEO/BTC’. The data will be resampled by calculating the weighted price for interval steps specified by interval. See: http://pandas.pydata.org/pandas-docs/stable/timeseries.html#offset-aliases for possible values.
prepare_request(dtime): returns a pandas DataFrame which contains the data for the requested datetime dtime.
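The volume-weighted resampling described here can be reproduced in a few lines of pandas (a simplified illustration of the idea, not ccgains’ internal code):

```python
import pandas as pd

def vwap_resample(df, interval='1h'):
    """Resample tick data to volume-weighted average prices per interval.

    df must have a DatetimeIndex and 'price' and 'volume' columns.
    """
    df = df.copy()
    df['pv'] = df['price'] * df['volume']          # price * volume per tick
    agg = df.resample(interval).sum()
    return agg['pv'] / agg['volume']               # weighted mean per bucket
```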
Create a CurrencyRelation object that puts all provided HistoricData currencies in relation in order to serve exchange rates for any pair of these currencies:
rel = ccgains.CurrencyRelation(
    h1, h2, h3, h4, h5, h6, h7, h8, h9, h10, h11, h12, h13, h14, h15,
    h16, h17, h18, h19, h20, h21, h22, h23, h24, h25, h26, h27, h28,
    h29, h30, h31, h32, h33, h34)
Create the BagQueue object that calculates taxable profit from trades using the first-in/first-out method:
(this needs to know your native fiat currency and the CurrencyRelation created above)
bf = ccgains.BagQueue('EUR', rel)
The TradeHistory object provides methods to load your trades from csv-files exported from various exchanges or apps.
th = ccgains.TradeHistory()
Export your trades from exchanges or apps as comma-separated files and append them to the list of trades managed by the TradeHistory object. All trades will be sorted automatically.
To load from a supported exchange, use the methods named `append_<exchange_name>_csv` found in TradeHistory (see trades.py).
th.append_poloniex_csv(
'./data/2020/poloniex_depositHistory_2023.csv',
'deposits')
th.append_poloniex_csv(
'./data/2020/poloniex_tradeHistory_2023.csv',
'trades',
condense_trades=True)
th.append_poloniex_csv(
'./data/2020/poloniex_withdrawalHistory_2023.csv',
'withdr')
th.append_binance_csv(
'./data/2020/binance_depositHistory_2023.csv',
'deposits')
th.append_binance_csv(
'./data/2020/binance_tradeHistory_2023.csv',
'trades')
th.append_binance_csv(
'./data/2020/binance_withdrawalHistory_2023.csv',
'withdr')
th.append_wirex_csv(
'./data/2020/wirex_btc_tradeHistory_2023.csv',
'trades')
th.append_cro_csv(
'./data/2021/cro_tradeHistory_2023.csv',
'trades')
If your exchange is not supported yet, add a new method in trades.py. In this example the methods append_wirex_csv and append_cro_csv are not supported in the original GitHub project.
Next I will show how I did it with append_cro_csv (Crypto.com). First import TPLOC_CRO_TRADES, the parameter-location mapping that tells ccgains how to read the CSV file exported from the Crypto.com exchange.
from dateutil import tz

from .cro_util import (
    TPLOC_CRO_TRADES)

def append_cro_csv(
        self, file_name, which_data='trades', delimiter=',',
        skiprows=1, default_timezone=tz.tzutc()):
    wdata = which_data[:5].lower()
    if wdata not in ['trade']:
        raise ValueError(
            '`which_data` must be "trades"')
    plocs = TPLOC_CRO_TRADES
    self.append_csv(
        file_name=file_name,
        param_locs=plocs,
        delimiter=delimiter,
        skiprows=skiprows,
        default_timezone=default_timezone)
Now create the cro_util.py file
from decimal import Decimal

def kind_for(csv_line):
    # The second column holds the transaction description,
    # e.g. 'Withdraw BTC', 'Buy CRO' or 'Transfer App->Exchange'.
    desc = csv_line[1].strip('" \n\t')
    if 'Withdraw' in desc or ('Transfer' in desc and 'App' in desc.split("->")[0]):
        return 'Withdrawal'
    elif ('Deposit' in desc or 'Reward' in desc
            or ('Transfer' in desc and 'Exchange' in desc.split("->")[0])):
        return 'Deposit'
    elif 'Buy' in desc:
        return 'Buy'
    elif '->' in desc:
        return 'Sell'
    else:
        return None

def get_buy_currency(csv_line):
    desc = csv_line[1].strip('" \n\t')
    if kind_for(csv_line) in ('Buy', 'Deposit') or 'App' in desc.split("->")[0]:
        return csv_line[2].strip('" \n\t')
    else:
        return desc.split("->")[1]

def get_sell_currency(csv_line):
    desc = csv_line[1].strip('" \n\t')
    if kind_for(csv_line) in ('Sell', 'Withdrawal') or 'Exchange' in desc.split("->")[0]:
        return csv_line[2].strip('" \n\t')
    else:
        return desc.split("->")[0]

# Trade parameters in the csv exported from Crypto.com
TPLOC_CRO_TRADES = {
    'kind': lambda cols: kind_for(cols),
    'dtime': 0,
    'buy_currency': lambda cols: get_buy_currency(cols)
        if kind_for(cols) in ('Buy', 'Deposit')
        else cols[4] if cols[4].strip('" \n\t') != '' else '',
    'buy_amount': lambda cols: abs(Decimal(cols[3]))
        if kind_for(cols) in ('Buy', 'Deposit')
        else abs(Decimal(cols[5])) if cols[5].strip('" \n\t') != '' else '',
    'sell_currency': lambda cols: get_sell_currency(cols)
        if kind_for(cols) in ('Sell', 'Withdrawal') else 'EUR',
    'sell_amount': lambda cols: abs(Decimal(cols[3]))
        if kind_for(cols) in ('Sell', 'Withdrawal') else cols[7],
    'fee_currency': -1,
    'fee_amount': -1,
    'exchange': 'Crypto.com',
    'mark': -1,
    'comment': lambda cols: cols[1],
}
Some exchanges, like Poloniex, do not include withdrawal fees in their exported csv files. The following call tries to add these missing fees by comparing withdrawn amounts with amounts deposited on other exchanges shortly after the withdrawal. Call it only after all transactions from every involved exchange and wallet have been imported.
It uses a really simple algorithm, so it is not guaranteed to work in every case, especially if you made withdrawals in tight succession on different exchanges, so please check the output.
th.add_missing_transaction_fees(raise_on_error=False)
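The matching idea behind add_missing_transaction_fees can be illustrated with a toy matcher (greatly simplified compared to the real implementation):

```python
def infer_withdrawal_fees(withdrawals, deposits, max_lag_s=3600):
    """Pair each withdrawal with the first later deposit of the same
    currency within max_lag_s seconds; the amount difference is the
    implied (missing) fee.

    Both lists contain (unix_time, currency, amount) tuples.
    """
    fees = []
    used = set()
    for wt, wcur, wamt in sorted(withdrawals):
        for i, (dt_, dcur, damt) in enumerate(sorted(deposits)):
            if i in used or dcur != wcur:
                continue
            if 0 <= dt_ - wt <= max_lag_s and damt <= wamt:
                fees.append((wt, wcur, wamt - damt))
                used.add(i)
                break
    return fees
```

This illustrates why tightly spaced withdrawals can confuse the matching: two candidate deposits within the window are paired purely by order.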
Some currencies have changed ticker symbols since their first listing date (e.g., AntShares (ANS) -> Neo (NEO)). This can lead to situations where all historical pricing data lists the new ticker symbol, but transaction history still lists the old ticker.
This method allows for renaming symbols in the TradeHistory, if any occurrences of the old name/ticker are found.
th.update_ticker_names({'ANS': 'NEO'})
You can export all imported trades for future reference into a single file, optionally filtered by year.
…either as a comma-separated text file (can be imported into ccgains):
th.export_to_csv('transactions2023.csv', year=2023)
…or as html or pdf file, with the possibility to filter or rename column headers or contents:
(This is an example for a translation into German)
my_column_names=[
'Art', 'Datum', 'Kaufmenge', 'Verkaufsmenge', u'Gebühren', u'Börse']
transdct = {'Buy': 'Anschaffung',
'BUY': 'Anschaffung',
'Sell': 'Tausch',
'SELL': 'Tausch',
'Purchase': 'Anschaffung',
'Exchange': 'Tausch', 'Disbursement': 'Abhebung',
'Deposit': 'Einzahlung',
'Withdrawal': 'Abhebung',
'Received funds': 'Einzahlung',
'Withdrawn from wallet': 'Abhebung',
'Create offer fee: a5ed7482': u'Börsengebühr',
'Buy BTC': 'Anschaffung',
'MultiSig deposit: a5ed7482': 'Abhebung',
'MultiSig payout: a5ed7482': 'Einzahlung'}
th.export_to_pdf('Transactions2021.pdf',
year=2021, drop_columns=['mark', 'comment'],
font_size=12,
caption=u"Handel mit digitalen Währungen %(year)s",
intro=u"<h4>Auflistung aller Transaktionen zwischen "
"%(fromdate)s und %(todate)s:</h4>",
locale="de_DE",
custom_column_names=my_column_names,
custom_formatters={
'Art': lambda x: transdct[x] if x in transdct else x})
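The 'Art' formatter above falls back to the original value whenever no translation is known. The same behaviour can be written more compactly with dict.get, which takes a default for missing keys:

```python
transdct = {'Buy': 'Anschaffung', 'Sell': 'Tausch'}

# Equivalent to: lambda x: transdct[x] if x in transdct else x
translate = lambda x: transdct.get(x, x)

print(translate('Buy'))      # translated: 'Anschaffung'
print(translate('Unknown'))  # no entry, passed through unchanged
```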
If the calculation was already run for previous years, we can load the state of the bags here; there is no need to calculate everything again:
bf.load('./status2022.json')
Or, if the current calculation crashed (e.g., because you forgot to add a traded currency in #2 above), the file ‘precrash.json’ will have been created automatically. Load it here to continue:
bf.load('./precrash.json')
The following simply determines where to resume the trade calculation, in case you already calculated some trades and then restarted by loading ‘precrash.json’:
last_trade = 0
while (last_trade < len(th.tlist)
       and th.tlist[last_trade].dtime <= bf._last_date):
last_trade += 1
if last_trade > 0:
logger.info("continuing with trade #%i" % (last_trade + 1))
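The same skip logic, demonstrated on a plain list of timestamps standing in for th.tlist and bf._last_date:

```python
from datetime import datetime

# Times of all imported trades (stand-in for th.tlist):
trade_times = [datetime(2023, 1, 1), datetime(2023, 2, 1), datetime(2023, 3, 1)]
# Date of the last trade already processed (stand-in for bf._last_date):
last_date = datetime(2023, 2, 1)

last_trade = 0
while last_trade < len(trade_times) and trade_times[last_trade] <= last_date:
    last_trade += 1

# The first two trades are skipped; processing resumes at trade #3:
print(last_trade)
```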
# Now, the calculation. This goes through your imported list of trades:
for i, trade in enumerate(th.tlist[last_trade:]):
# Most of this is just the log output to the console and to the
# file 'ccgains_<date-time>.log'
# (check out this file for all gory calculation details!):
logger.info('TRADE #%i', i + last_trade + 1)
logger.info(trade)
# This is the important part:
bf.process_trade(trade)
# more logging:
log_bags(bf)
logger.info("Totals: %s", str(bf.totals))
logger.info("Gains (in %s): %s\n" % (bf.currency, str(bf.profit)))
bf.save('status2023.json')
The default column names used in the report don’t look very nice: [‘kind’, ‘bag_spent’, ‘currency’, ‘bag_date’, ‘sell_date’, ‘exchange’, ‘short_term’, ‘spent_cost’, ‘proceeds’, ‘profit’], so we rename them:
my_column_names=[
'Type', 'Amount spent', u'Currency', 'Purchase date',
'Sell date', u'Exchange', u'Short term', 'Purchase cost',
'Proceeds', 'Profit']
Here we create the report PDF for the capital gains in 2023.
date_precision='D' means we only mention the day of each trade, not the precise time. We also set combine=True, so multiple trades made on the same day on the same exchange are combined into a single row in the report:
my_column_names=[
'Art', 'Verkaufsmenge', u'Währung', 'Erwerbsdatum',
'Verkaufsdatum', u'Börse', u'in\xa0Besitz',
'Anschaffungskosten', u'Verkaufserlös', 'Gewinn']
transdct = {'sale': u'Veräußerung',
'withdrawal fee': u'Börsengebühr',
'deposit fee': u'Börsengebühr',
'exchange fee': u'Börsengebühr'}
convert_short_term=[u'>\xa01\xa0Jahr', u'<\xa01\xa0Jahr']
bf.report.export_report_to_pdf(
'Report2023_de.pdf', year=2023,
date_precision='D', combine=True,
custom_column_names=my_column_names,
custom_formatters={
u'in\xa0Besitz': lambda b: convert_short_term[b],
'Art': lambda x: transdct[x]},
locale="de_DE",
template_file='shortreport_de.html'
)
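Note the trick in the 'in Besitz' formatter: short_term is a boolean, and Python accepts booleans as list indices (bool is a subclass of int, so False is 0 and True is 1). convert_short_term[b] therefore picks '> 1 Jahr' for long-term holdings and '< 1 Jahr' for short-term ones:

```python
convert_short_term = ['>\xa01\xa0Jahr', '<\xa01\xa0Jahr']

# False indexes element 0, True indexes element 1:
print(convert_short_term[False])  # held longer than one year
print(convert_short_term[True])   # short-term holding
```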
# If you rather want your report in a spreadsheet, you can export
# to csv:
bf.report.export_short_report_to_csv(
'report_2023.csv', year=2023,
date_precision='D', combine=False,
convert_timezone=True, strip_timezone=True)
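What combine=True does conceptually: report rows that share the sell date and exchange are merged, with their amounts and profits summed. A rough stand-alone sketch of that grouping (the field names are illustrative, not the library's internals):

```python
from itertools import groupby
from decimal import Decimal

def combine_rows(rows):
    """Merge report rows sharing (sell_date, exchange), summing profits."""
    key = lambda r: (r['sell_date'], r['exchange'])
    combined = []
    for (sell_date, exchange), group in groupby(sorted(rows, key=key), key=key):
        combined.append({
            'sell_date': sell_date,
            'exchange': exchange,
            'profit': sum((r['profit'] for r in group), Decimal(0)),
        })
    return combined

rows = [
    {'sell_date': '2023-06-01', 'exchange': 'Kraken', 'profit': Decimal('10')},
    {'sell_date': '2023-06-01', 'exchange': 'Kraken', 'profit': Decimal('5')},
    {'sell_date': '2023-06-02', 'exchange': 'Kraken', 'profit': Decimal('7')},
]
print(combine_rows(rows))  # the two 2023-06-01 rows are merged into one
```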
The simple capital gains report created above is just a plain listing of all trades and the gains made, which is enough for the tax report.
A more detailed listing that outlines the calculation is also available:
bf.report.export_extended_report_to_pdf(
'Details_2023.pdf', year=2023,
date_precision='S', combine=False,
font_size=10, locale="en_US")
And again, let’s translate this report into German (reusing transdct from above to translate the payment kind):
bf.report.export_extended_report_to_pdf(
'Details_2023_de.pdf', year=2023,
date_precision='S', combine=False,
font_size=10, locale="de_DE",
template_file='fullreport_de.html',
payment_kind_translation=transdct)
Now run
python taxReport2023.py
This should generate the PDF files Details_2023.pdf and Details_2023_de.pdf, which are the reports needed by the tax authorities.
For more information on crypto tax report generation or customization to your needs, contact us at lnsolutions.ee