Updated README

Anup Kumar Panwar 2019-07-06 11:11:20 +05:30
parent 831558d38d
commit 4e413c0183
45 changed files with 404 additions and 702 deletions

View File

@@ -72,9 +72,9 @@ We want your work to be readable by others; therefore, we encourage you to note
- Write tests to illustrate your work.
The following "testing" approaches are not encouraged:
The following "testing" approaches are **not** encouraged:
```python
```python*
input('Enter your input:')
# Or even worse...
input = eval(raw_input("Enter your input: "))
@@ -97,13 +97,9 @@ We want your work to be readable by others; therefore, we encourage you to note
#### Other Standard While Submitting Your Work
- File extension for code should be `.py`.
- File extension for code should be `.py`. Jupyter notebook files are acceptable for machine learning algorithms.
- Please file your work to let others use it in the future. Here are the examples that are acceptable:
- Camel cases
- `-` Hyphenated names
- `_` Underscore-separated names
- Strictly use snake_case (underscore-separated) file names, as they will be easier to parse with scripts in the future.
If possible, follow the standard *within* the folder you are submitting to.
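For contrast with the discouraged interactive pattern shown in the first hunk above, here is a minimal sketch of a self-contained doctest-style test; the `gcd` function and its expected values are illustrative examples, not code from this repository:

```python
def gcd(a: int, b: int) -> int:
    """Return the greatest common divisor of two integers.

    >>> gcd(12, 8)
    4
    >>> gcd(7, 5)
    1
    """
    while b:
        a, b = b, a % b
    return a


if __name__ == "__main__":
    import doctest

    doctest.testmod()  # runs the examples embedded in the docstring
```

Tests written this way need no user input, so they can run unattended.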

DIRECTORY.py (new file, 50 lines)
View File

@@ -0,0 +1,50 @@
import os


def getListOfFiles(dirName):
    # Build a list of files and subdirectories under the given directory,
    # printing a Markdown index as it walks the tree: a "## heading" per
    # folder and a "* [Name](path)" link per entry.
    listOfFile = os.listdir(dirName)
    allFiles = list()
    # Iterate over all the entries
    for entry in listOfFile:
        if entry == '.git':
            continue
        # Create the full path
        fullPath = os.path.join(dirName, entry)
        # Turn a snake_case name into Title Case link text
        entryName = entry.split('_')
        ffname = ''
        try:
            for word in entryName:
                temp = word[0].upper() + word[1:]
                ffname = ffname + ' ' + temp
            final_fn = ffname.replace('.py', '').strip()
            print('* [' + final_fn + '](' + fullPath + ')')
        except IndexError:
            # Skip empty name parts (e.g. leading or doubled underscores)
            pass
        # If the entry is a directory, print a heading and recurse into it
        if os.path.isdir(fullPath):
            print('\n## ' + entry)
            filesInCurrDir = getListOfFiles(fullPath)
            allFiles = allFiles + filesInCurrDir
        else:
            allFiles.append(fullPath)
    return allFiles


dirName = './'
# Get the list of all files in the directory tree at the given path
listOfFiles = getListOfFiles(dirName)
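The script prints its Markdown index to stdout, so it can be captured with something like `python DIRECTORY.py > DIRECTORY.md` (the target file name is an assumption, not part of this commit). A pure-Python equivalent is sketched below:

```python
import contextlib
import io

# Importing DIRECTORY executes its module-level code, which prints the index,
# so capture stdout around the import and write the result to a file.
buffer = io.StringIO()
with contextlib.redirect_stdout(buffer):
    import DIRECTORY  # noqa: F401  (imported for its side effect)

with open("DIRECTORY.md", "w") as fh:  # hypothetical output file name
    fh.write(buffer.getvalue())
```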

README.md (346 lines)
View File

@@ -1,4 +1,5 @@
# The Algorithms - Python <!-- [![Build Status](https://travis-ci.org/TheAlgorithms/Python.svg)](https://travis-ci.org/TheAlgorithms/Python) -->
[![Donate](https://img.shields.io/badge/Donate-PayPal-green.svg)](https://www.paypal.me/TheAlgorithms/100) &nbsp;
[![Gitter chat](https://badges.gitter.im/gitterHQ/gitter.png)](https://gitter.im/TheAlgorithms) &nbsp;
[![Open in Gitpod](https://gitpod.io/button/open-in-gitpod.svg)](https://gitpod.io/#https://github.com/TheAlgorithms/Python)
@@ -7,6 +8,17 @@
These implementations are for learning purposes. They may be less efficient than the implementations in the Python standard library.
## Owners
Anup Kumar Panwar
&nbsp; [[Gmail](mailto:1anuppanwar@gmail.com?Subject=The%20Algorithms%20-%20Python)
&nbsp; [GitHub](https://github.com/anupkumarpanwar)
&nbsp; [LinkedIn](https://www.linkedin.com/in/anupkumarpanwar/)]
Chetan Kaushik
&nbsp; [[Gmail](mailto:dynamitechetan@gmail.com?Subject=The%20Algorithms%20-%20Python)
&nbsp; [GitHub](https://github.com/dynamitechetan)
&nbsp; [LinkedIn](https://www.linkedin.com/in/chetankaushik/)]
## Contribution Guidelines
@@ -15,3 +27,337 @@ Read our [Contribution Guidelines](CONTRIBUTING.md) before you contribute.
## Community Channel
We're on [Gitter](https://gitter.im/TheAlgorithms)! Please join us.
# Algorithms
## Hashes
- [Md5](./hashes/md5.py)
- [Chaos Machine](./hashes/chaos_machine.py)
- [Sha1](./hashes/sha1.py)
## File Transfer Protocol
- [Ftp Client Server](./file_transfer_protocol/ftp_client_server.py)
- [Ftp Send Receive](./file_transfer_protocol/ftp_send_receive.py)
## Backtracking
- [N Queens](./backtracking/n_queens.py)
- [Sum Of Subsets](./backtracking/sum_of_subsets.py)
## Ciphers
- [Transposition Cipher](./ciphers/transposition_cipher.py)
- [Atbash](./ciphers/Atbash.py)
- [Rot13](./ciphers/rot13.py)
- [Rabin Miller](./ciphers/rabin_miller.py)
- [Transposition Cipher Encrypt Decrypt File](./ciphers/transposition_cipher_encrypt_decrypt_file.py)
- [Affine Cipher](./ciphers/affine_cipher.py)
- [Trafid Cipher](./ciphers/trafid_cipher.py)
- [Base16](./ciphers/base16.py)
- [Elgamal Key Generator](./ciphers/elgamal_key_generator.py)
- [Rsa Cipher](./ciphers/rsa_cipher.py)
- [Prehistoric Men.txt](./ciphers/prehistoric_men.txt)
- [Vigenere Cipher](./ciphers/vigenere_cipher.py)
- [Xor Cipher](./ciphers/xor_cipher.py)
- [Brute Force Caesar Cipher](./ciphers/brute_force_caesar_cipher.py)
- [Rsa Key Generator](./ciphers/rsa_key_generator.py)
- [Simple Substitution Cipher](./ciphers/simple_substitution_cipher.py)
- [Playfair Cipher](./ciphers/playfair_cipher.py)
- [Morse Code Implementation](./ciphers/morse_Code_implementation.py)
- [Base32](./ciphers/base32.py)
- [Base85](./ciphers/base85.py)
- [Base64 Cipher](./ciphers/base64_cipher.py)
- [Onepad Cipher](./ciphers/onepad_cipher.py)
- [Caesar Cipher](./ciphers/caesar_cipher.py)
- [Hill Cipher](./ciphers/hill_cipher.py)
- [Cryptomath Module](./ciphers/cryptomath_module.py)
## Arithmetic Analysis
- [Bisection](./arithmetic_analysis/bisection.py)
- [Newton Method](./arithmetic_analysis/newton_method.py)
- [Newton Raphson Method](./arithmetic_analysis/newton_raphson_method.py)
- [Intersection](./arithmetic_analysis/intersection.py)
- [Lu Decomposition](./arithmetic_analysis/lu_decomposition.py)
## Boolean Algebra
- [Quine Mc Cluskey](./boolean_algebra/quine_mc_cluskey.py)
## Traversals
- [Binary Tree Traversals](./traversals/binary_tree_traversals.py)
## Maths
- [Average](./maths/average.py)
- [Abs Max](./maths/abs_Max.py)
- [Average Median](./maths/average_median.py)
- [Trapezoidal Rule](./maths/trapezoidal_rule.py)
- [Prime Check](./maths/Prime_Check.py)
- [Modular Exponential](./maths/modular_exponential.py)
- [Newton Raphson](./maths/newton_raphson.py)
- [Factorial Recursive](./maths/factorial_recursive.py)
- [Extended Euclidean Algorithm](./maths/extended_euclidean_algorithm.py)
- [Greater Common Divisor](./maths/greater_common_divisor.py)
- [Fibonacci](./maths/fibonacci.py)
- [Find Lcm](./maths/find_lcm.py)
- [Find Max](./maths/Find_Max.py)
- [Fermat Little Theorem](./maths/fermat_little_theorem.py)
- [Factorial Python](./maths/factorial_python.py)
- [Fibonacci Sequence Recursion](./maths/fibonacci_sequence_recursion.py)
- [Sieve Of Eratosthenes](./maths/sieve_of_eratosthenes.py)
- [Abs Min](./maths/abs_Min.py)
- [Lucas Series](./maths/lucasSeries.py)
- [Segmented Sieve](./maths/segmented_sieve.py)
- [Find Min](./maths/Find_Min.py)
- [Abs](./maths/abs.py)
- [Simpson Rule](./maths/simpson_rule.py)
- [Basic Maths](./maths/basic_maths.py)
- [3n+1](./maths/3n+1.py)
- [Binary Exponentiation](./maths/Binary_Exponentiation.py)
## Digital Image Processing
- ## Filters
- [Median Filter](./digital_image_processing/filters/median_filter.py)
- [Gaussian Filter](./digital_image_processing/filters/gaussian_filter.py)
## Compression
- [Peak Signal To Noise Ratio](./compression_analysis/peak_signal_to_noise_ratio.py)
- [Huffman](./compression/huffman.py)
## Graphs
- [BFS Shortest Path](./graphs/bfs_shortest_path.py)
- [Directed And Undirected (Weighted) Graph](<./graphs/Directed_and_Undirected_(Weighted)_Graph.py>)
- [Minimum Spanning Tree Prims](./graphs/minimum_spanning_tree_prims.py)
- [Graph Matrix](./graphs/graph_matrix.py)
- [Basic Graphs](./graphs/basic_graphs.py)
- [Dijkstra 2](./graphs/dijkstra_2.py)
- [Tarjans Strongly Connected Components](./graphs/tarjans_scc.py)
- [Check Bipartite Graph BFS](./graphs/check_bipartite_graph_bfs.py)
- [Depth First Search](./graphs/depth_first_search.py)
- [Kahns Algorithm Long](./graphs/kahns_algorithm_long.py)
- [Breadth First Search](./graphs/breadth_first_search.py)
- [Dijkstra](./graphs/dijkstra.py)
- [Articulation Points](./graphs/articulation_points.py)
- [Bellman Ford](./graphs/bellman_ford.py)
- [Check Bipartite Graph Dfs](./graphs/check_bipartite_graph_dfs.py)
- [Strongly Connected Components Kosaraju](./graphs/scc_kosaraju.py)
- [Multi Hueristic Astar](./graphs/multi_hueristic_astar.py)
- [Page Rank](./graphs/page_rank.py)
- [Eulerian Path And Circuit For Undirected Graph](./graphs/Eulerian_path_and_circuit_for_undirected_graph.py)
- [Edmonds Karp Multiple Source And Sink](./graphs/edmonds_karp_multiple_source_and_sink.py)
- [Floyd Warshall](./graphs/floyd_warshall.py)
- [Minimum Spanning Tree Kruskal](./graphs/minimum_spanning_tree_kruskal.py)
- [Prim](./graphs/prim.py)
- [Kahns Algorithm Topo](./graphs/kahns_algorithm_topo.py)
- [BFS](./graphs/BFS.py)
- [Finding Bridges](./graphs/finding_bridges.py)
- [Graph List](./graphs/graph_list.py)
- [Dijkstra Algorithm](./graphs/dijkstra_algorithm.py)
- [A Star](./graphs/a_star.py)
- [Even Tree](./graphs/even_tree.py)
- [DFS](./graphs/DFS.py)
## Networking Flow
- [Minimum Cut](./networking_flow/minimum_cut.py)
- [Ford Fulkerson](./networking_flow/ford_fulkerson.py)
## Matrix
- [Matrix Operation](./matrix/matrix_operation.py)
- [Searching In Sorted Matrix](./matrix/searching_in_sorted_matrix.py)
- [Spiral Print](./matrix/spiral_print.py)
## Searches
- [Quick Select](./searches/quick_select.py)
- [Binary Search](./searches/binary_search.py)
- [Interpolation Search](./searches/interpolation_search.py)
- [Jump Search](./searches/jump_search.py)
- [Linear Search](./searches/linear_search.py)
- [Ternary Search](./searches/ternary_search.py)
- [Tabu Search](./searches/tabu_search.py)
- [Sentinel Linear Search](./searches/sentinel_linear_search.py)
## Conversions
- [Decimal To Binary](./conversions/decimal_to_binary.py)
- [Decimal To Octal](./conversions/decimal_to_octal.py)
## Dynamic Programming
- [Fractional Knapsack](./dynamic_programming/Fractional_Knapsack.py)
- [Sum Of Subset](./dynamic_programming/sum_of_subset.py)
- [Fast Fibonacci](./dynamic_programming/fast_fibonacci.py)
- [Bitmask](./dynamic_programming/bitmask.py)
- [Abbreviation](./dynamic_programming/abbreviation.py)
- [Rod Cutting](./dynamic_programming/rod_cutting.py)
- [Knapsack](./dynamic_programming/knapsack.py)
- [Max Sub Array](./dynamic_programming/max_sub_array.py)
- [Fibonacci](./dynamic_programming/fibonacci.py)
- [Minimum Partition](./dynamic_programming/minimum_partition.py)
- [K Means Clustering Tensorflow](./dynamic_programming/k_means_clustering_tensorflow.py)
- [Coin Change](./dynamic_programming/coin_change.py)
- [Subset Generation](./dynamic_programming/subset_generation.py)
- [Floyd Warshall](./dynamic_programming/floyd_warshall.py)
- [Longest Sub Array](./dynamic_programming/longest_sub_array.py)
- [Integer Partition](./dynamic_programming/integer_partition.py)
- [Matrix Chain Order](./dynamic_programming/matrix_chain_order.py)
- [Edit Distance](./dynamic_programming/edit_distance.py)
- [Longest Common Subsequence](./dynamic_programming/longest_common_subsequence.py)
- [Longest Increasing Subsequence O(nlogn)](<./dynamic_programming/longest_increasing_subsequence_O(nlogn).py>)
- [Longest Increasing Subsequence](./dynamic_programming/longest_increasing_subsequence.py)
## Divide And Conquer
- [Max Subarray Sum](./divide_and_conquer/max_subarray_sum.py)
- [Max Sub Array Sum](./divide_and_conquer/max_sub_array_sum.py)
- [Closest Pair Of Points](./divide_and_conquer/closest_pair_of_points.py)
## Strings
- [Knuth Morris Pratt](./strings/knuth_morris_pratt.py)
- [Rabin Karp](./strings/rabin_karp.py)
- [Naive String Search](./strings/naive_String_Search.py)
- [Levenshtein Distance](./strings/levenshtein_distance.py)
- [Min Cost String Conversion](./strings/min_cost_string_conversion.py)
- [Boyer Moore Search](./strings/Boyer_Moore_Search.py)
- [Manacher](./strings/manacher.py)
## Sorts
- [Quick Sort](./sorts/quick_sort.py)
- [Selection Sort](./sorts/selection_sort.py)
- [Bitonic Sort](./sorts/Bitonic_Sort.py)
- [Cycle Sort](./sorts/cycle_sort.py)
- [Comb Sort](./sorts/comb_sort.py)
- [Topological Sort](./sorts/topological_sort.py)
- [Merge Sort Fastest](./sorts/merge_sort_fastest.py)
- [Random Pivot Quick Sort](./sorts/random_pivot_quick_sort.py)
- [Heap Sort](./sorts/heap_sort.py)
- [Insertion Sort](./sorts/insertion_sort.py)
- [Counting Sort](./sorts/counting_sort.py)
- [Bucket Sort](./sorts/bucket_sort.py)
- [Quick Sort 3 Partition](./sorts/quick_sort_3_partition.py)
- [Bogo Sort](./sorts/bogo_sort.py)
- [Shell Sort](./sorts/shell_sort.py)
- [Pigeon Sort](./sorts/pigeon_sort.py)
- [Odd-Even Transposition Parallel](./sorts/Odd-Even_transposition_parallel.py)
- [Tree Sort](./sorts/tree_sort.py)
- [Cocktail Shaker Sort](./sorts/cocktail_shaker_sort.py)
- [Random Normal Distribution Quicksort](./sorts/random_normal_distribution_quicksort.py)
- [Wiggle Sort](./sorts/wiggle_sort.py)
- [Pancake Sort](./sorts/pancake_sort.py)
- [External Sort](./sorts/external_sort.py)
- [Tim Sort](./sorts/tim_sort.py)
- [Sorting Graphs.png](./sorts/sorting_graphs.png)
- [Radix Sort](./sorts/radix_sort.py)
- [Odd-Even Transposition Single-threaded](./sorts/Odd-Even_transposition_single-threaded.py)
- [Bubble Sort](./sorts/bubble_sort.py)
- [Gnome Sort](./sorts/gnome_sort.py)
- [Merge Sort](./sorts/merge_sort.py)
## Machine Learning
- [Perceptron](./machine_learning/perceptron.py)
- [Random Forest Classifier](./machine_learning/random_forest_classification/random_forest_classifier.ipynb)
- [NaiveBayes.ipynb](./machine_learning/NaiveBayes.ipynb)
- [Scoring Functions](./machine_learning/scoring_functions.py)
- [Logistic Regression](./machine_learning/logistic_regression.py)
- [Gradient Descent](./machine_learning/gradient_descent.py)
- [Linear Regression](./machine_learning/linear_regression.py)
- [Random Forest Regression](./machine_learning/random_forest_regression/random_forest_regression.py)
- [Random Forest Regression](./machine_learning/random_forest_regression/random_forest_regression.ipynb)
- [Reuters One Vs Rest Classifier.ipynb](./machine_learning/reuters_one_vs_rest_classifier.ipynb)
- [Decision Tree](./machine_learning/decision_tree.py)
- [Knn Sklearn](./machine_learning/knn_sklearn.py)
- [K Means Clust](./machine_learning/k_means_clust.py)
## Neural Network
- [Perceptron](./neural_network/perceptron.py)
- [Fully Connected Neural Network](./neural_network/fully_connected_neural_network.ipynb)
- [Convolution Neural Network](./neural_network/convolution_neural_network.py)
- [Back Propagation Neural Network](./neural_network/back_propagation_neural_network.py)
## Data Structures
- ## Binary Tree
- [Basic Binary Tree](./data_structures/binary_tree/basic_binary_tree.py)
- [Red Black Tree](./data_structures/binary_tree/red_black_tree.py)
- [Fenwick Tree](./data_structures/binary_tree/fenwick_tree.py)
- [Treap](./data_structures/binary_tree/treap.py)
- [AVL Tree](./data_structures/binary_tree/AVL_tree.py)
- [Segment Tree](./data_structures/binary_tree/segment_tree.py)
- [Lazy Segment Tree](./data_structures/binary_tree/lazy_segment_tree.py)
- [Binary Search Tree](./data_structures/binary_tree/binary_search_tree.py)
- ## Trie
- [Trie](./data_structures/trie/trie.py)
- ## Linked List
- [Swap Nodes](./data_structures/linked_list/swap_nodes.py)
- [Doubly Linked List](./data_structures/linked_list/doubly_linked_list.py)
- [Singly Linked List](./data_structures/linked_list/singly_linked_list.py)
- [Is Palindrome](./data_structures/linked_list/is_Palindrome.py)
- ## Stacks
- [Postfix Evaluation](./data_structures/stacks/postfix_evaluation.py)
- [Balanced Parentheses](./data_structures/stacks/balanced_parentheses.py)
- [Infix To Prefix Conversion](./data_structures/stacks/infix_to_prefix_conversion.py)
- [Stack](./data_structures/stacks/stack.py)
- [Infix To Postfix Conversion](./data_structures/stacks/infix_to_postfix_conversion.py)
- [Next Greater Element](./data_structures/stacks/next_greater_element.py)
- [Stock Span Problem](./data_structures/stacks/stock_span_problem.py)
- ## Queue
- [Queue On Pseudo Stack](./data_structures/queue/queue_on_pseudo_stack.py)
- [Double Ended Queue](./data_structures/queue/double_ended_queue.py)
- [Queue On List](./data_structures/queue/queue_on_list.py)
- ## Heap
- [Heap](./data_structures/heap/heap.py)
- ## Hashing
- [Hash Table With Linked List](./data_structures/hashing/hash_table_with_linked_list.py)
- [Quadratic Probing](./data_structures/hashing/quadratic_probing.py)
- [Hash Table](./data_structures/hashing/hash_table.py)
- [Double Hash](./data_structures/hashing/double_hash.py)
## Other
- [Detecting English Programmatically](./other/detecting_english_programmatically.py)
- [Fischer Yates Shuffle](./other/fischer_yates_shuffle.py)
- [Primelib](./other/primelib.py)
- [Binary Exponentiation 2](./other/binary_exponentiation_2.py)
- [Anagrams](./other/anagrams.py)
- [Palindrome](./other/palindrome.py)
- [Finding Primes](./other/finding_Primes.py)
- [Two Sum](./other/two_sum.py)
- [Password Generator](./other/password_generator.py)
- [Linear Congruential Generator](./other/linear_congruential_generator.py)
- [Frequency Finder](./other/frequency_finder.py)
- [Euclidean Gcd](./other/euclidean_gcd.py)
- [Word Patterns](./other/word_patterns.py)
- [Nested Brackets](./other/nested_brackets.py)
- [Binary Exponentiation](./other/binary_exponentiation.py)
- [Sierpinski Triangle](./other/sierpinski_triangle.py)
- [Game Of Life](./other/game_of_life.py)
- [Tower Of Hanoi](./other/tower_of_hanoi.py)

Six image files appear with identical before/after sizes (4.3 MiB, 104 KiB, 26 KiB, 29 KiB, 476 KiB, 82 KiB), consistent with the relocation into `image_data/` shown in the hunk below.

View File

@@ -21,11 +21,11 @@ def psnr(original, contrast):
def main():
dir_path = os.path.dirname(os.path.realpath(__file__))
# Loading images (original image and compressed image)
original = cv2.imread(os.path.join(dir_path, 'original_image.png'))
contrast = cv2.imread(os.path.join(dir_path, 'compressed_image.png'), 1)
original = cv2.imread(os.path.join(dir_path, 'image_data/original_image.png'))
contrast = cv2.imread(os.path.join(dir_path, 'image_data/compressed_image.png'), 1)
original2 = cv2.imread(os.path.join(dir_path, 'PSNR-example-base.png'))
contrast2 = cv2.imread(os.path.join(dir_path, 'PSNR-example-comp-10.jpg'), 1)
original2 = cv2.imread(os.path.join(dir_path, 'image_data/PSNR-example-base.png'))
contrast2 = cv2.imread(os.path.join(dir_path, 'image_data/PSNR-example-comp-10.jpg'), 1)
# Value expected: 29.73dB
print("-- First Test --")

View File

@@ -1,3 +0,0 @@
arr = [10, 20, 30, 40]
arr[1] = 30 # set element 1 (20) of array to 30
print(arr)

View File

@@ -1,181 +0,0 @@
"""
An AVL tree
"""
from __future__ import print_function
class Node:
def __init__(self, label):
self.label = label
self._parent = None
self._left = None
self._right = None
self.height = 0
@property
def right(self):
return self._right
@right.setter
def right(self, node):
if node is not None:
node._parent = self
self._right = node
@property
def left(self):
return self._left
@left.setter
def left(self, node):
if node is not None:
node._parent = self
self._left = node
@property
def parent(self):
return self._parent
@parent.setter
def parent(self, node):
if node is not None:
self._parent = node
self.height = self.parent.height + 1
else:
self.height = 0
class AVL:
def __init__(self):
self.root = None
self.size = 0
def insert(self, value):
node = Node(value)
if self.root is None:
self.root = node
self.root.height = 0
self.size = 1
else:
# Same as Binary Tree
dad_node = None
curr_node = self.root
while True:
if curr_node is not None:
dad_node = curr_node
if node.label < curr_node.label:
curr_node = curr_node.left
else:
curr_node = curr_node.right
else:
node.height = dad_node.height
dad_node.height += 1
if node.label < dad_node.label:
dad_node.left = node
else:
dad_node.right = node
self.rebalance(node)
self.size += 1
break
def rebalance(self, node):
n = node
while n is not None:
height_right = n.height
height_left = n.height
if n.right is not None:
height_right = n.right.height
if n.left is not None:
height_left = n.left.height
if abs(height_left - height_right) > 1:
if height_left > height_right:
left_child = n.left
if left_child is not None:
h_right = (left_child.right.height
if (left_child.right is not None) else 0)
h_left = (left_child.left.height
if (left_child.left is not None) else 0)
if (h_left > h_right):
self.rotate_left(n)
break
else:
self.double_rotate_right(n)
break
else:
right_child = n.right
if right_child is not None:
h_right = (right_child.right.height
if (right_child.right is not None) else 0)
h_left = (right_child.left.height
if (right_child.left is not None) else 0)
if (h_left > h_right):
self.double_rotate_left(n)
break
else:
self.rotate_right(n)
break
n = n.parent
def rotate_left(self, node):
aux = node.parent.label
node.parent.label = node.label
node.parent.right = Node(aux)
node.parent.right.height = node.parent.height + 1
node.parent.left = node.right
def rotate_right(self, node):
aux = node.parent.label
node.parent.label = node.label
node.parent.left = Node(aux)
node.parent.left.height = node.parent.height + 1
node.parent.right = node.right
def double_rotate_left(self, node):
self.rotate_right(node.getRight().getRight())
self.rotate_left(node)
def double_rotate_right(self, node):
self.rotate_left(node.getLeft().getLeft())
self.rotate_right(node)
def empty(self):
if self.root is None:
return True
return False
def preShow(self, curr_node):
if curr_node is not None:
self.preShow(curr_node.left)
print(curr_node.label, end=" ")
self.preShow(curr_node.right)
def preorder(self, curr_node):
if curr_node is not None:
self.preShow(curr_node.left)
self.preShow(curr_node.right)
print(curr_node.label, end=" ")
def getRoot(self):
return self.root
t = AVL()
t.insert(1)
t.insert(2)
t.insert(3)
# t.preShow(t.root)
# print("\n")
# t.insert(4)
# t.insert(5)
# t.preShow(t.root)
# t.preorden(t.root)

View File

@@ -1,6 +0,0 @@
from .hash_table import HashTable
class QuadraticProbing(HashTable):
def __init__(self):
super(self.__class__, self).__init__()

View File

@@ -1,78 +0,0 @@
from __future__ import absolute_import
from .union_find import UnionFind
import unittest
class TestUnionFind(unittest.TestCase):
def test_init_with_valid_size(self):
uf = UnionFind(5)
self.assertEqual(uf.size, 5)
def test_init_with_invalid_size(self):
with self.assertRaises(ValueError):
uf = UnionFind(0)
with self.assertRaises(ValueError):
uf = UnionFind(-5)
def test_union_with_valid_values(self):
uf = UnionFind(10)
for i in range(11):
for j in range(11):
uf.union(i, j)
def test_union_with_invalid_values(self):
uf = UnionFind(10)
with self.assertRaises(ValueError):
uf.union(-1, 1)
with self.assertRaises(ValueError):
uf.union(11, 1)
def test_same_set_with_valid_values(self):
uf = UnionFind(10)
for i in range(11):
for j in range(11):
if i == j:
self.assertTrue(uf.same_set(i, j))
else:
self.assertFalse(uf.same_set(i, j))
uf.union(1, 2)
self.assertTrue(uf.same_set(1, 2))
uf.union(3, 4)
self.assertTrue(uf.same_set(3, 4))
self.assertFalse(uf.same_set(1, 3))
self.assertFalse(uf.same_set(1, 4))
self.assertFalse(uf.same_set(2, 3))
self.assertFalse(uf.same_set(2, 4))
uf.union(1, 3)
self.assertTrue(uf.same_set(1, 3))
self.assertTrue(uf.same_set(1, 4))
self.assertTrue(uf.same_set(2, 3))
self.assertTrue(uf.same_set(2, 4))
uf.union(4, 10)
self.assertTrue(uf.same_set(1, 10))
self.assertTrue(uf.same_set(2, 10))
self.assertTrue(uf.same_set(3, 10))
self.assertTrue(uf.same_set(4, 10))
def test_same_set_with_invalid_values(self):
uf = UnionFind(10)
with self.assertRaises(ValueError):
uf.same_set(-1, 1)
with self.assertRaises(ValueError):
uf.same_set(11, 0)
if __name__ == '__main__':
unittest.main()

View File

@@ -1,87 +0,0 @@
class UnionFind():
"""
https://en.wikipedia.org/wiki/Disjoint-set_data_structure
The union-find is a disjoint-set data structure
You can merge two sets and tell if one set belongs to
another one.
It's used on the Kruskal Algorithm
(https://en.wikipedia.org/wiki/Kruskal%27s_algorithm)
The elements are in range [0, size]
"""
def __init__(self, size):
if size <= 0:
raise ValueError("size should be greater than 0")
self.size = size
# The below plus 1 is because we are using elements
# in range [0, size]. It makes more sense.
# Every set begins with only itself
self.root = [i for i in range(size+1)]
# This is used for heuristic union by rank
self.weight = [0 for i in range(size+1)]
def union(self, u, v):
"""
Union of the sets u and v.
Complexity: log(n).
Amortized complexity: < 5 (it's very fast).
"""
self._validate_element_range(u, "u")
self._validate_element_range(v, "v")
if u == v:
return
# Using union by rank will guarantee the
# log(n) complexity
rootu = self._root(u)
rootv = self._root(v)
weight_u = self.weight[rootu]
weight_v = self.weight[rootv]
if weight_u >= weight_v:
self.root[rootv] = rootu
if weight_u == weight_v:
self.weight[rootu] += 1
else:
self.root[rootu] = rootv
def same_set(self, u, v):
"""
Return true if the elements u and v belongs to
the same set
"""
self._validate_element_range(u, "u")
self._validate_element_range(v, "v")
return self._root(u) == self._root(v)
def _root(self, u):
"""
Get the element set root.
This uses the heuristic path compression
See wikipedia article for more details.
"""
if u != self.root[u]:
self.root[u] = self._root(self.root[u])
return self.root[u]
def _validate_element_range(self, u, element_name):
"""
Raises ValueError if element is not in range
"""
if u < 0 or u > self.size:
msg = ("element {0} with value {1} "
"should be in range [0~{2}]")\
.format(element_name, u, self.size)
raise ValueError(msg)
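A brief usage sketch of the removed class, assuming it is importable as `union_find` (the name used by the deleted test module's relative import):

```python
from union_find import UnionFind

uf = UnionFind(5)          # elements 0..5, each initially in its own set
uf.union(1, 2)
uf.union(2, 3)             # union by rank keeps the trees shallow
print(uf.same_set(1, 3))   # True  -- 1 and 3 were merged transitively
print(uf.same_set(1, 4))   # False -- 4 is still in its own set
```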

View File

@@ -1,29 +0,0 @@
"""Tower of Hanoi."""
# @author willx75
# The Tower of Hanoi is a puzzle consisting of three rods and a number of
# disks of different sizes, which can slide onto any rod
import logging
log = logging.getLogger()
logging.basicConfig(level=logging.DEBUG)
def Tower_Of_Hanoi(n, source, dest, by, movement):
"""Tower of Hanoi - Move plates to different rods."""
if n == 0:
return n
elif n == 1:
movement += 1
# no print statement
# (you could make it an optional flag for printing logs)
logging.debug('Move the plate from', source, 'to', dest)
return movement
else:
movement = movement + Tower_Of_Hanoi(n - 1, source, by, dest, 0)
logging.debug('Move the plate from', source, 'to', dest)
movement = movement + 1 + Tower_Of_Hanoi(n - 1, by, dest, source, 0)
return movement
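A quick usage sketch of the removed function: moving `n` plates takes `2**n - 1` moves. (Note that the `logging.debug` calls above pass print-style arguments, so they emit logging format errors rather than the intended messages; the returned move count is unaffected.)

```python
moves = Tower_Of_Hanoi(3, source='A', dest='C', by='B', movement=0)
print(moves)  # 7, i.e. 2**3 - 1
```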

View File

@@ -1 +0,0 @@
from .. import fibonacci

View File

@@ -1,34 +0,0 @@
"""
To run with slash:
1. run pip install slash (may need to install C++ builds from Visual Studio website)
2. In the command prompt navigate to your project folder
3. then type--> slash run -vv -k tags:fibonacci ..
-vv indicates the level of verbosity (how much stuff you want the test to spit out after running)
-k is a way to select the tests you want to run. This becomes much more important in large scale projects.
"""
import slash
from .. import fibonacci
default_fib = [0, 1, 1, 2, 3, 5, 8]
@slash.tag('fibonacci')
@slash.parametrize(('n', 'seq'), [(2, [0, 1]), (3, [0, 1, 1]), (9, [0, 1, 1, 2, 3, 5, 8, 13, 21])])
def test_different_sequence_lengths(n, seq):
"""Test output of varying fibonacci sequence lengths"""
iterative = fibonacci.fib_iterative(n)
formula = fibonacci.fib_formula(n)
assert iterative == seq
assert formula == seq
@slash.tag('fibonacci')
@slash.parametrize('n', [7.3, 7.8, 7.0])
def test_float_input_iterative(n):
"""Test when user enters a float value"""
iterative = fibonacci.fib_iterative(n)
formula = fibonacci.fib_formula(n)
assert iterative == default_fib
assert formula == default_fib

A 224 KiB binary image was deleted (content not shown).

View File

@@ -1,93 +0,0 @@
import unittest
from interpolation_search import interpolation_search, interpolation_search_by_recursion
class Test_interpolation_search(unittest.TestCase):
def setUp(self):
# un-sorted case
self.collection1 = [5,3,4,6,7]
self.item1 = 4
# sorted case, result exists
self.collection2 = [10,30,40,45,50,66,77,93]
self.item2 = 66
# sorted case, result doesn't exist
self.collection3 = [10,30,40,45,50,66,77,93]
self.item3 = 67
# equal elements case, result exists
self.collection4 = [10,10,10,10,10]
self.item4 = 10
# equal elements case, result doesn't exist
self.collection5 = [10,10,10,10,10]
self.item5 = 3
# 1 element case, result exists
self.collection6 = [10]
self.item6 = 10
# 1 element case, result doesn't exists
self.collection7 = [10]
self.item7 = 1
def tearDown(self):
pass
def test_interpolation_search(self):
self.assertEqual(interpolation_search(self.collection1, self.item1), None)
self.assertEqual(interpolation_search(self.collection2, self.item2), self.collection2.index(self.item2))
self.assertEqual(interpolation_search(self.collection3, self.item3), None)
self.assertEqual(interpolation_search(self.collection4, self.item4), self.collection4.index(self.item4))
self.assertEqual(interpolation_search(self.collection5, self.item5), None)
self.assertEqual(interpolation_search(self.collection6, self.item6), self.collection6.index(self.item6))
self.assertEqual(interpolation_search(self.collection7, self.item7), None)
class Test_interpolation_search_by_recursion(unittest.TestCase):
def setUp(self):
# un-sorted case
self.collection1 = [5,3,4,6,7]
self.item1 = 4
# sorted case, result exists
self.collection2 = [10,30,40,45,50,66,77,93]
self.item2 = 66
# sorted case, result doesn't exist
self.collection3 = [10,30,40,45,50,66,77,93]
self.item3 = 67
# equal elements case, result exists
self.collection4 = [10,10,10,10,10]
self.item4 = 10
# equal elements case, result doesn't exist
self.collection5 = [10,10,10,10,10]
self.item5 = 3
# 1 element case, result exists
self.collection6 = [10]
self.item6 = 10
# 1 element case, result doesn't exists
self.collection7 = [10]
self.item7 = 1
def tearDown(self):
pass
def test_interpolation_search_by_recursion(self):
self.assertEqual(interpolation_search_by_recursion(self.collection1, self.item1, 0, len(self.collection1)-1), None)
self.assertEqual(interpolation_search_by_recursion(self.collection2, self.item2, 0, len(self.collection2)-1), self.collection2.index(self.item2))
self.assertEqual(interpolation_search_by_recursion(self.collection3, self.item3, 0, len(self.collection3)-1), None)
self.assertEqual(interpolation_search_by_recursion(self.collection4, self.item4, 0, len(self.collection4)-1), self.collection4.index(self.item4))
self.assertEqual(interpolation_search_by_recursion(self.collection5, self.item5, 0, len(self.collection5)-1), None)
self.assertEqual(interpolation_search_by_recursion(self.collection6, self.item6, 0, len(self.collection6)-1), self.collection6.index(self.item6))
self.assertEqual(interpolation_search_by_recursion(self.collection7, self.item7, 0, len(self.collection7)-1), None)
if __name__ == '__main__':
unittest.main()

View File

@@ -1,46 +0,0 @@
import unittest
import os
from tabu_search import generate_neighbours, generate_first_solution, find_neighborhood, tabu_search
TEST_FILE = os.path.join(os.path.dirname(__file__), './tabu_test_data.txt')
NEIGHBOURS_DICT = {'a': [['b', '20'], ['c', '18'], ['d', '22'], ['e', '26']],
'c': [['a', '18'], ['b', '10'], ['d', '23'], ['e', '24']],
'b': [['a', '20'], ['c', '10'], ['d', '11'], ['e', '12']],
'e': [['a', '26'], ['b', '12'], ['c', '24'], ['d', '40']],
'd': [['a', '22'], ['b', '11'], ['c', '23'], ['e', '40']]}
FIRST_SOLUTION = ['a', 'c', 'b', 'd', 'e', 'a']
DISTANCE = 105
NEIGHBOURHOOD_OF_SOLUTIONS = [['a', 'e', 'b', 'd', 'c', 'a', 90],
['a', 'c', 'd', 'b', 'e', 'a', 90],
['a', 'd', 'b', 'c', 'e', 'a', 93],
['a', 'c', 'b', 'e', 'd', 'a', 102],
['a', 'c', 'e', 'd', 'b', 'a', 113],
['a', 'b', 'c', 'd', 'e', 'a', 119]]
class TestClass(unittest.TestCase):
def test_generate_neighbours(self):
neighbours = generate_neighbours(TEST_FILE)
self.assertEqual(NEIGHBOURS_DICT, neighbours)
def test_generate_first_solutions(self):
first_solution, distance = generate_first_solution(TEST_FILE, NEIGHBOURS_DICT)
self.assertEqual(FIRST_SOLUTION, first_solution)
self.assertEqual(DISTANCE, distance)
def test_find_neighbours(self):
neighbour_of_solutions = find_neighborhood(FIRST_SOLUTION, NEIGHBOURS_DICT)
self.assertEqual(NEIGHBOURHOOD_OF_SOLUTIONS, neighbour_of_solutions)
def test_tabu_search(self):
best_sol, best_cost = tabu_search(FIRST_SOLUTION, DISTANCE, NEIGHBOURS_DICT, 4, 3)
self.assertEqual(['a', 'd', 'b', 'e', 'c', 'a'], best_sol)
self.assertEqual(87, best_cost)

View File

@@ -1,6 +0,0 @@
# simple client server
#### Note:
- Run **`server.py`** first.
- Now, run **`client.py`**.
- verify the output.

View File

@@ -1,29 +0,0 @@
# client.py
import socket
HOST, PORT = '127.0.0.1', 1400
s = socket.socket(
socket.AF_INET, # ADDRESS FAMILIES
#Name Purpose
#AF_UNIX, AF_LOCAL Local communication
#AF_INET IPv4 Internet protocols
#AF_INET6 IPv6 Internet protocols
#AF_APPLETALK Appletalk
#AF_BLUETOOTH Bluetooth
socket.SOCK_STREAM # SOCKET TYPES
#Name Way of Interaction
#SOCK_STREAM TCP
#SOCK_DGRAM UDP
)
s.connect((HOST, PORT))
s.send('Hello World'.encode('ascii'))#in UDP use sendto()
data = s.recv(1024)#in UDP use recvfrom()
s.close()#end the connection
print(repr(data.decode('ascii')))

View File

@@ -1,21 +0,0 @@
# server.py
import socket
HOST, PORT = '127.0.0.1', 1400
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)#refer to client.py
s.bind((HOST, PORT))
s.listen(1)#listen for 1 connection
conn, addr = s.accept()#start the actual data flow
print('connected to:', addr)
while 1:
data = conn.recv(1024).decode('ascii')#receive 1024 bytes and decode using ascii
if not data:
break
conn.send((data + ' [ addition by server ]').encode('ascii'))
conn.close()

A 10 KiB binary image was deleted (content not shown).

View File

@@ -1,76 +0,0 @@
"""Test Sort Algorithms for Errors."""
from bogo_sort import bogo_sort
from bubble_sort import bubble_sort
from bucket_sort import bucket_sort
from cocktail_shaker_sort import cocktail_shaker_sort
from comb_sort import comb_sort
from counting_sort import counting_sort
from cycle_sort import cycle_sort
from gnome_sort import gnome_sort
from heap_sort import heap_sort
from insertion_sort import insertion_sort
from merge_sort_fastest import merge_sort as merge_sort_fastest
from merge_sort import merge_sort
from pancake_sort import pancake_sort
from quick_sort_3_partition import quick_sort_3partition
from quick_sort import quick_sort
from radix_sort import radix_sort
from random_pivot_quick_sort import quick_sort_random
from selection_sort import selection_sort
from shell_sort import shell_sort
from tim_sort import tim_sort
from topological_sort import topological_sort
from tree_sort import tree_sort
from wiggle_sort import wiggle_sort
TEST_CASES = [
{'input': [8, 7, 6, 5, 4, 3, -2, -5], 'expected': [-5, -2, 3, 4, 5, 6, 7, 8]},
{'input': [-5, -2, 3, 4, 5, 6, 7, 8], 'expected': [-5, -2, 3, 4, 5, 6, 7, 8]},
{'input': [5, 6, 1, 4, 0, 1, -2, -5, 3, 7], 'expected': [-5, -2, 0, 1, 1, 3, 4, 5, 6, 7]},
{'input': [2, -2], 'expected': [-2, 2]},
{'input': [1], 'expected': [1]},
{'input': [], 'expected': []},
]
'''
TODO:
- Fix some broken tests in particular cases (as [] for example),
- Unify the input format: should always be function(input_collection) (no additional args)
- Unify the output format: should always be a collection instead of
updating input elements and returning None
- Rewrite some algorithms in function format (in case there is no function definition)
'''
TEST_FUNCTIONS = [
bogo_sort,
bubble_sort,
bucket_sort,
cocktail_shaker_sort,
comb_sort,
counting_sort,
cycle_sort,
gnome_sort,
heap_sort,
insertion_sort,
merge_sort_fastest,
merge_sort,
pancake_sort,
quick_sort_3partition,
quick_sort,
radix_sort,
quick_sort_random,
selection_sort,
shell_sort,
tim_sort,
topological_sort,
tree_sort,
wiggle_sort,
]
for function in TEST_FUNCTIONS:
for case in TEST_CASES:
result = function(case['input'])
assert result == case['expected'], 'Executed function: {}, {} != {}'.format(function.__name__, result, case['expected'])