Greatest Common Divisor

The Greatest Common Divisor (GCD) function is useful for simplifying fractions, i.e. reducing rational numbers to lowest terms. In machine learning specifically, GCD appears in algorithms for computing tensor factorizations and matrix decompositions. GCD(a, b) is the largest positive integer that divides both a and b without a remainder; for example, GCD(48, 18) = 6.
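
For intuition, the GCD is classically computed with the Euclidean algorithm, which repeatedly replaces the pair (a, b) with (b, a mod b) until the remainder reaches zero. The sketch below is purely illustrative and is not the client's code; the client itself delegates to Go's math/big package, as shown further down.

package main

import "fmt"

// gcd is an illustrative Euclidean-algorithm implementation for non-negative inputs.
func gcd(a, b uint64) uint64 {
	for b != 0 {
		a, b = b, a%b
	}
	return a
}

func main() {
	fmt.Println(gcd(48, 18)) // 6
}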

To implement the GCD function, see the method signature below. Note that a and b are unsigned integers (uint256), so only non-negative values are accepted.

function GCD(uint256 a, uint256 b) external returns (uint256)
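
As a hedged illustration of calling this interface, the snippet below issues a read-only eth_call through go-ethereum's ethclient and accounts/abi packages. It assumes the Neural client exposes a standard Ethereum JSON-RPC endpoint and that the GCD function is reachable at some contract address; the endpoint URL and address used here are placeholders, not values from this documentation.

package main

import (
	"context"
	"fmt"
	"log"
	"math/big"
	"strings"

	ethereum "github.com/ethereum/go-ethereum"
	"github.com/ethereum/go-ethereum/accounts/abi"
	"github.com/ethereum/go-ethereum/common"
	"github.com/ethereum/go-ethereum/ethclient"
)

// Minimal ABI fragment matching the GCD signature above.
const gcdABI = `[{"type":"function","name":"GCD","stateMutability":"nonpayable","inputs":[{"name":"a","type":"uint256"},{"name":"b","type":"uint256"}],"outputs":[{"name":"","type":"uint256"}]}]`

func main() {
	// Placeholder endpoint and address; replace with your Neural node and deployment.
	client, err := ethclient.Dial("http://localhost:8545")
	if err != nil {
		log.Fatal(err)
	}
	contract := common.HexToAddress("0x0000000000000000000000000000000000000000")

	parsed, err := abi.JSON(strings.NewReader(gcdABI))
	if err != nil {
		log.Fatal(err)
	}

	// ABI-encode GCD(48, 18) and issue a read-only call (no transaction is sent).
	input, err := parsed.Pack("GCD", big.NewInt(48), big.NewInt(18))
	if err != nil {
		log.Fatal(err)
	}
	out, err := client.CallContract(context.Background(), ethereum.CallMsg{To: &contract, Data: input}, nil)
	if err != nil {
		log.Fatal(err)
	}

	// A single uint256 return value is one 32-byte big-endian word.
	fmt.Println(new(big.Int).SetBytes(out)) // expected: 6
}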

Under the hood, the following function is used directly in the Neural client.

// GCD returns the greatest common divisor of a and b.
func (con *NeuralMath) GCD(gas *big.Int, a, b *big.Int) (*big.Int, error, *big.Int) {
	var gcd big.Int
	// Passing nil for x and y skips the Bézout coefficients; only the GCD itself is computed.
	return gcd.GCD(nil, nil, a, b), nil, nil
}
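
For reference, (*big.Int).GCD from Go's standard math/big package sets its receiver to the greatest common divisor of its last two arguments and returns the receiver; the first two arguments receive the Bézout coefficients when non-nil. A minimal standalone sketch of the same call, independent of the client:

package main

import (
	"fmt"
	"math/big"
)

func main() {
	a := big.NewInt(48)
	b := big.NewInt(18)

	// nil x and y skip the Bézout coefficients; only the GCD itself is computed.
	var gcd big.Int
	gcd.GCD(nil, nil, a, b)

	fmt.Println(gcd.String()) // 6
}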
