Columns: signature (string, 8–3.44k chars), body (string, 0–1.41M chars), docstring (string, 1–122k chars), id (string, 5–17 chars)
@staticmethod<EOL><INDENT>def parse(readDataInstance):<DEDENT>
dbgDir = ImageDebugDirectory()<EOL>dbgDir.characteristics.value = readDataInstance.readDword()<EOL>dbgDir.timeDateStamp.value = readDataInstance.readDword()<EOL>dbgDir.majorVersion.value = readDataInstance.readWord()<EOL>dbgDir.minorVersion.value = readDataInstance.readWord()<EOL>dbgDir.type.value = readDataInstance.re...
Returns a new L{ImageDebugDirectory} object. @type readDataInstance: L{ReadData} @param readDataInstance: A new L{ReadData} object with data to be parsed as a L{ImageDebugDirectory} object. @rtype: L{ImageDebugDirectory} @return: A new L{ImageDebugDirectory} object.
f11794:c10:m2
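The sequence of readDword/readWord calls in parse above fixes the on-disk layout of the entry. A minimal stdlib sketch of the same parse (the struct format string is an assumption derived from those DWORD/WORD reads; field names mirror the attributes assigned above):

```python
import struct

# Layout implied by the reads above: two DWORDs, two WORDs, four DWORDs,
# little-endian; 28 bytes total.
DEBUG_DIR_FMT = "<IIHHIIII"
DEBUG_DIR_SIZE = struct.calcsize(DEBUG_DIR_FMT)  # 28

def parse_debug_directory(data, offset=0):
    """Return the eight fields of one debug-directory entry as a dict."""
    fields = struct.unpack_from(DEBUG_DIR_FMT, data, offset)
    names = ("characteristics", "timeDateStamp", "majorVersion",
             "minorVersion", "type", "sizeOfData",
             "addressOfRawData", "pointerToRawData")
    return dict(zip(names, fields))
```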
def __init__(self, shouldPack = True):
self.shouldPack = shouldPack<EOL>
Array of L{ImageDebugDirectory} objects. @type shouldPack: bool @param shouldPack: (Optional) If set to C{True}, the object will be packed. If set to C{False}, the object won't be packed.
f11794:c11:m0
def getType(self):
return consts.IMAGE_DEBUG_DIRECTORIES<EOL>
Returns L{consts.IMAGE_DEBUG_DIRECTORIES}.
f11794:c11:m2
@staticmethod<EOL><INDENT>def parse(readDataInstance, nDebugEntries):<DEDENT>
dbgEntries = ImageDebugDirectories()<EOL>dataLength = len(readDataInstance)<EOL>toRead = nDebugEntries * consts.SIZEOF_IMAGE_DEBUG_ENTRY32<EOL>if dataLength >= toRead:<EOL><INDENT>for i in range(nDebugEntries):<EOL><INDENT>dbgEntry = ImageDebugDirectory.parse(readDataInstance)<EOL>dbgEntries.append(dbgEntry)<EOL><DEDEN...
Returns a new L{ImageDebugDirectories} object. @type readDataInstance: L{ReadData} @param readDataInstance: A L{ReadData} object with data to be parsed as a L{ImageDebugDirectories} object. @type nDebugEntries: int @param nDebugEntries: Number of L{ImageDebugDirectory} objects in the C{readDataInstance} object. @rty...
f11794:c11:m3
def __init__(self, shouldPack = True):
baseclasses.BaseStructClass.__init__(self, shouldPack)<EOL>self.moduleName = datatypes.String("<STR_LIT>") <EOL>self.numberOfImports = datatypes.DWORD(<NUM_LIT:0>) <EOL>self._attrsList = ["<STR_LIT>", "<STR_LIT>"]<EOL>
Class used to store metadata from the L{ImageImportDescriptor} object. @type shouldPack: bool @param shouldPack: (Optional) If set to C{True}, the object will be packed. If set to C{False}, the object won't be packed.
f11794:c12:m0
def getType(self):
return consts.IID_METADATA<EOL>
Returns L{consts.IID_METADATA}.
f11794:c12:m1
def __init__(self, shouldPack = True):
baseclasses.BaseStructClass.__init__(self, shouldPack)<EOL>self.metaData = ImageImportDescriptorMetaData() <EOL>self.originalFirstThunk = datatypes.DWORD(<NUM_LIT:0>) <EOL>self.timeDateStamp = datatypes.DWORD(<NUM_LIT:0>) <EOL>self.forwarderChain = datatypes.DWORD(<NUM_LIT:0>) <EOL>self.name = datatypes.DWORD(<NUM_LIT:...
Class representation of a C{IMAGE_IMPORT_DESCRIPTOR} structure. @see: Figure 5 U{http://msdn.microsoft.com/es-ar/magazine/bb985996%28en-us%29.aspx} @type shouldPack: bool @param shouldPack: (Optional) If set to C{True}, the object will be packed. If set to C{False}, the object won't be packed.
f11794:c13:m0
@staticmethod<EOL><INDENT>def parse(readDataInstance):<DEDENT>
iid = ImageImportDescriptorEntry()<EOL>iid.originalFirstThunk.value = readDataInstance.readDword()<EOL>iid.timeDateStamp.value = readDataInstance.readDword()<EOL>iid.forwarderChain.value = readDataInstance.readDword()<EOL>iid.name.value = readDataInstance.readDword()<EOL>iid.firstThunk.value = readDataInstance.readDwor...
Returns a new L{ImageImportDescriptorEntry} object. @type readDataInstance: L{ReadData} @param readDataInstance: A L{ReadData} object with data to be parsed as a L{ImageImportDescriptorEntry}. @rtype: L{ImageImportDescriptorEntry} @return: A new L{ImageImportDescriptorEntry} object.
f11794:c13:m1
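The five readDword calls in parse above define a 20-byte descriptor. A stdlib sketch of the same read, plus a terminator-scan loop (the zero-terminator convention is an alternative to the explicit nEntries count the array parser above uses; field names mirror the attributes assigned above):

```python
import struct

# Five DWORDs, little-endian, as read above.
IID_FMT = "<IIIII"

def parse_import_descriptor(data, offset=0):
    keys = ("originalFirstThunk", "timeDateStamp", "forwarderChain",
            "name", "firstThunk")
    return dict(zip(keys, struct.unpack_from(IID_FMT, data, offset)))

def parse_import_descriptors(data):
    """Read 20-byte descriptors until an all-zero terminator entry."""
    out, off = [], 0
    while off + 20 <= len(data):
        entry = parse_import_descriptor(data, off)
        if not any(entry.values()):
            break
        out.append(entry)
        off += 20
    return out
```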
def getType(self):
return consts.IMAGE_IMPORT_DESCRIPTOR_ENTRY<EOL>
Returns C{consts.IMAGE_IMPORT_DESCRIPTOR_ENTRY}.
f11794:c13:m2
def __init__(self, shouldPack = True):
self.shouldPack = shouldPack<EOL>
Array of L{ImageImportDescriptorEntry} objects. @type shouldPack: bool @param shouldPack: (Optional) If set to C{True}, the object will be packed. If set to C{False}, the object won't be packed.
f11794:c14:m0
def getType(self):
return consts.IMAGE_IMPORT_DESCRIPTOR<EOL>
Returns L{consts.IMAGE_IMPORT_DESCRIPTOR}.
f11794:c14:m2
@staticmethod<EOL><INDENT>def parse(readDataInstance, nEntries):<DEDENT>
importEntries = ImageImportDescriptor()<EOL>dataLength = len(readDataInstance)<EOL>toRead = nEntries * consts.SIZEOF_IMAGE_IMPORT_ENTRY32<EOL>if dataLength >= toRead:<EOL><INDENT>for i in range(nEntries):<EOL><INDENT>importEntry = ImageImportDescriptorEntry.parse(readDataInstance)<EOL>importEntries.append(importEntry)<...
Returns a new L{ImageImportDescriptor} object. @type readDataInstance: L{ReadData} @param readDataInstance: A L{ReadData} object with data to be parsed as a L{ImageImportDescriptor} object. @type nEntries: int @param nEntries: The number of L{ImageImportDescriptorEntry} objects in the C{readDataInstance} object. @rt...
f11794:c14:m3
def __init__(self, shouldPack = True):
baseclasses.BaseStructClass.__init__(self, shouldPack)<EOL>self.firstThunk = datatypes.DWORD(<NUM_LIT:0>) <EOL>self.originalFirstThunk = datatypes.DWORD(<NUM_LIT:0>) <EOL>self.hint = datatypes.WORD(<NUM_LIT:0>) <EOL>self.name = datatypes.String("<STR_LIT>") <EOL>self._attrsList = ["<STR_LIT>", "<STR_LIT>", "<STR_LIT>...
A class representation of a C{} structure. @type shouldPack: bool @param shouldPack: (Optional) If set to C{True}, the object will be packed. If set to C{False}, the object won't be packed.
f11794:c15:m0
def getType(self):
return consts.IMPORT_ADDRESS_TABLE_ENTRY<EOL>
Returns L{consts.IMPORT_ADDRESS_TABLE_ENTRY}.
f11794:c15:m1
def __init__(self, shouldPack = True):
baseclasses.BaseStructClass.__init__(self, shouldPack)<EOL>self.firstThunk = datatypes.QWORD(<NUM_LIT:0>) <EOL>self.originalFirstThunk = datatypes.QWORD(<NUM_LIT:0>) <EOL>self.hint = datatypes.WORD(<NUM_LIT:0>) <EOL>self.name = datatypes.String("<STR_LIT>") <EOL>self._attrsList = ["<STR_LIT>", "<STR_LIT>", "<STR_LIT>...
A class representation of a C{} structure. @type shouldPack: bool @param shouldPack: (Optional) If set to C{True}, the object will be packed. If set to C{False}, the object won't be packed.
f11794:c16:m0
def getType(self):
return consts.IMPORT_ADDRESS_TABLE_ENTRY64<EOL>
Returns L{consts.IMPORT_ADDRESS_TABLE_ENTRY64}.
f11794:c16:m1
def __init__(self, shouldPack = True):
baseclasses.BaseStructClass.__init__(self, shouldPack)<EOL>self.ordinal = datatypes.DWORD(<NUM_LIT:0>) <EOL>self.functionRva = datatypes.DWORD(<NUM_LIT:0>) <EOL>self.nameOrdinal = datatypes.WORD(<NUM_LIT:0>) <EOL>self.nameRva = datatypes.DWORD(<NUM_LIT:0>) <EOL>self.name = datatypes.String("<STR_LIT>") <EOL>self._attr...
A class representation of a C{} structure. @type shouldPack: bool @param shouldPack: (Optional) If set to C{True}, the object will be packed. If set to C{False}, the object won't be packed.
f11794:c19:m0
def getType(self):
return consts.EXPORT_TABLE_ENTRY<EOL>
Returns L{consts.EXPORT_TABLE_ENTRY}.
f11794:c19:m2
@staticmethod<EOL><INDENT>def parse(readDataInstance):<DEDENT>
exportEntry = ExportTableEntry()<EOL>exportEntry.functionRva.value = readDataInstance.readDword()<EOL>exportEntry.nameOrdinal.value = readDataInstance.readWord()<EOL>exportEntry.nameRva.value = readDataInstance.readDword()<EOL>exportEntry.name.value = readDataInstance.readString()<EOL>return exportEntry<EOL>
Returns a new L{ExportTableEntry} object. @type readDataInstance: L{ReadData} @param readDataInstance: A L{ReadData} object with data to be parsed as a L{ExportTableEntry} object. @rtype: L{ExportTableEntry} @return: A new L{ExportTableEntry} object.
f11794:c19:m3
def __init__(self, shouldPack = True):
baseclasses.BaseStructClass.__init__(self, shouldPack)<EOL>self.exportTable = ExportTable()<EOL>self.characteristics = datatypes.DWORD(<NUM_LIT:0>) <EOL>self.timeDateStamp = datatypes.DWORD(<NUM_LIT:0>) <EOL>self.majorVersion = datatypes.WORD(<NUM_LIT:0>) <EOL>self.minorVersion = datatypes.WORD(<NUM_LIT:0>) <EOL>self....
Class representation of a C{IMAGE_EXPORT_DIRECTORY} structure. @see: Figure 2 U{http://msdn.microsoft.com/en-us/magazine/bb985996.aspx} @type shouldPack: bool @param shouldPack: (Optional) If set to C{True}, the object will be packed. If set to C{False}, the object won't be packed.
f11794:c20:m0
def getType(self):
return consts.EXPORT_DIRECTORY<EOL>
Returns L{consts.EXPORT_DIRECTORY}.
f11794:c20:m1
@staticmethod<EOL><INDENT>def parse(readDataInstance):<DEDENT>
et = ImageExportTable()<EOL>et.characteristics.value = readDataInstance.readDword()<EOL>et.timeDateStamp.value = readDataInstance.readDword()<EOL>et.majorVersion.value = readDataInstance.readWord()<EOL>et.minorVersion.value = readDataInstance.readWord()<EOL>et.name.value = readDataInstance.readDword()<EOL>et.base.value...
Returns a new L{ImageExportTable} object. @type readDataInstance: L{ReadData} @param readDataInstance: A L{ReadData} object with data to be parsed as a L{ImageExportTable} object. @rtype: L{ImageExportTable} @return: A new L{ImageExportTable} object.
f11794:c20:m2
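The reads in ImageExportTable.parse above (two DWORDs, two WORDs, then a run of DWORDs) correspond to the 40-byte export directory. A stdlib sketch under that assumption, with field names following the conventional Windows names:

```python
import struct

# 2 DWORDs, 2 WORDs, 7 DWORDs; 40 bytes little-endian.
EXPORT_DIR_FMT = "<2I2H7I"

def parse_export_directory(data, offset=0):
    names = ("characteristics", "timeDateStamp", "majorVersion",
             "minorVersion", "name", "base", "numberOfFunctions",
             "numberOfNames", "addressOfFunctions", "addressOfNames",
             "addressOfNameOrdinals")
    return dict(zip(names, struct.unpack_from(EXPORT_DIR_FMT, data, offset)))
```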
def __init__(self, shouldPack = True):
baseclasses.BaseStructClass.__init__(self, shouldPack)<EOL>self.directory = NetDirectory() <EOL>self.netMetaDataHeader = NetMetaDataHeader() <EOL>self.netMetaDataStreams = NetMetaDataStreams() <EOL>self._attrsList = ["<STR_LIT>", "<STR_LIT>", "<STR_LIT>"]<EOL>
A class to abstract data from the .NET PE format. @type shouldPack: bool @param shouldPack: (Optional) If set to C{True}, the object will be packed. If set to C{False}, the object won't be packed.
f11794:c21:m0
@staticmethod<EOL><INDENT>def parse(readDataInstance):<DEDENT>
nd = NETDirectory()<EOL>nd.directory = NetDirectory.parse(readDataInstance)<EOL>nd.netMetaDataHeader = NetMetaDataHeader.parse(readDataInstance)<EOL>nd.netMetaDataStreams = NetMetaDataStreams.parse(readDataInstance)<EOL>return nd<EOL>
Returns a new L{NETDirectory} object. @type readDataInstance: L{ReadData} @param readDataInstance: A L{ReadData} object with data to be parsed as a L{NETDirectory} object. @rtype: L{NETDirectory} @return: A new L{NETDirectory} object.
f11794:c21:m1
def getType(self):
return consts.NET_DIRECTORY<EOL>
Returns L{consts.NET_DIRECTORY}.
f11794:c21:m2
def __init__(self, shouldPack = True):
baseclasses.BaseStructClass.__init__(self, shouldPack)<EOL>self.cb = datatypes.DWORD(<NUM_LIT:0>) <EOL>self.majorRuntimeVersion = datatypes.WORD(<NUM_LIT:0>) <EOL>self.minorRuntimeVersion = datatypes.WORD(<NUM_LIT:0>) <EOL>self.metaData = datadirs.Directory() <EOL>self.flags = datatypes.DWORD(<NUM_LIT:0>) <EOL>self.en...
A class representation of the C{IMAGE_COR20_HEADER} structure. @see: U{http://www.ntcore.com/files/dotnetformat.htm} @type shouldPack: bool @param shouldPack: (Optional) If set to C{True}, the object will be packed. If set to C{False}, the object won't be packed.
f11794:c22:m0
def getType(self):
return consts.IMAGE_COR20_HEADER<EOL>
Returns L{consts.IMAGE_COR20_HEADER}.
f11794:c22:m1
@staticmethod<EOL><INDENT>def parse(readDataInstance):<DEDENT>
nd = NetDirectory()<EOL>nd.cb.value = readDataInstance.readDword()<EOL>nd.majorRuntimeVersion.value= readDataInstance.readWord()<EOL>nd.minorRuntimeVersion.value = readDataInstance.readWord()<EOL>nd.metaData.rva.value = readDataInstance.readDword()<EOL>nd.metaData.size.value = readDataInstance.readDword()<EOL>nd.metaDa...
Returns a new L{NetDirectory} object. @type readDataInstance: L{ReadData} @param readDataInstance: A L{ReadData} object with data to be parsed as a L{NetDirectory} object. @rtype: L{NetDirectory} @return: A new L{NetDirectory} object.
f11794:c22:m2
def getType(self):
return consts.NET_METADATA_HEADER<EOL>
Returns L{consts.NET_METADATA_HEADER}.
f11794:c23:m1
@staticmethod<EOL><INDENT>def parse(readDataInstance):<DEDENT>
nmh = NetMetaDataHeader()<EOL>nmh.signature.value = readDataInstance.readDword()<EOL>nmh.majorVersion.value = readDataInstance.readWord()<EOL>nmh.minorVersion.value = readDataInstance.readWord()<EOL>nmh.reserved.value = readDataInstance.readDword()<EOL>nmh.versionLength.value = readDataInstance.readDword()<EOL>nmh.vers...
Returns a new L{NetMetaDataHeader} object. @type readDataInstance: L{ReadData} @param readDataInstance: A L{ReadData} object with data to be parsed as a L{NetMetaDataHeader} object. @rtype: L{NetMetaDataHeader} @return: A new L{NetMetaDataHeader} object.
f11794:c23:m2
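The reads in NetMetaDataHeader.parse above (signature, major/minor version, reserved, versionLength, then a version string) match the fixed prefix of the CLR metadata root, whose signature is the ASCII bytes "BSJB". A stdlib sketch (helper name and dict keys are illustrative):

```python
import struct

# "BSJB" read as a little-endian DWORD.
BSJB_SIGNATURE = 0x424A5342

def parse_net_metadata_header(data):
    """Parse the fixed prefix of the CLR metadata root and its version string."""
    sig, major, minor, reserved, vlen = struct.unpack_from("<IHHII", data, 0)
    if sig != BSJB_SIGNATURE:
        raise ValueError("not a CLR metadata root")
    # The version string occupies vlen bytes after the 16-byte prefix,
    # NUL-terminated and NUL-padded.
    version = data[16:16 + vlen].split(b"\x00", 1)[0].decode("ascii")
    return {"majorVersion": major, "minorVersion": minor, "version": version}
```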
def getType(self):
return consts.NET_METADATA_STREAM_ENTRY<EOL>
Returns L{consts.NET_METADATA_STREAM_ENTRY}.
f11794:c24:m1
@staticmethod<EOL><INDENT>def parse(readDataInstance):<DEDENT>
n = NetMetaDataStreamEntry()<EOL>n.offset.value = readDataInstance.readDword()<EOL>n.size.value = readDataInstance.readDword()<EOL>n.name.value = readDataInstance.readAlignedString()<EOL>return n<EOL>
Returns a new L{NetMetaDataStreamEntry} object. @type readDataInstance: L{ReadData} @param readDataInstance: A L{ReadData} object with data to be parsed as a L{NetMetaDataStreamEntry}. @rtype: L{NetMetaDataStreamEntry} @return: A new L{NetMetaDataStreamEntry} object.
f11794:c24:m2
def getType(self):
return consts.NET_METADATA_STREAMS<EOL>
Returns L{consts.NET_METADATA_STREAMS}.
f11794:c25:m4
@staticmethod<EOL><INDENT>def parse(readDataInstance, nStreams):<DEDENT>
streams = NetMetaDataStreams()<EOL>for i in range(nStreams):<EOL><INDENT>streamEntry = NetMetaDataStreamEntry()<EOL>streamEntry.offset.value = readDataInstance.readDword()<EOL>streamEntry.size.value = readDataInstance.readDword()<EOL>streamEntry.name.value = readDataInstance.readAlignedString()<EOL>streams.update({ i: ...
Returns a new L{NetMetaDataStreams} object. @type readDataInstance: L{ReadData} @param readDataInstance: A L{ReadData} object with data to be parsed as a L{NetMetaDataStreams} object. @type nStreams: int @param nStreams: The number of L{NetMetaDataStreamEntry} objects in the C{readDataInstance} object. @rtype: L{Net...
f11794:c25:m5
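Each stream entry above is two DWORDs (offset, size) followed by a name read with readAlignedString. One plausible implementation of that aligned read, assuming the .NET convention that stream names are NUL-terminated ASCII padded to a 4-byte boundary:

```python
def read_aligned_string(data, offset):
    """Read a NUL-terminated ASCII string starting at offset, then
    advance past the padding to the next 4-byte boundary."""
    end = data.index(b"\x00", offset)
    name = data[offset:end].decode("ascii")
    consumed = end - offset + 1          # include the terminator
    padded = (consumed + 3) & ~3         # round up to a multiple of 4
    return name, offset + padded
```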
def getType(self):
return consts.NET_METADATA_TABLE_HEADER<EOL>
Returns L{consts.NET_METADATA_TABLE_HEADER}.
f11794:c26:m1
@staticmethod<EOL><INDENT>def parse(readDataInstance):<DEDENT>
th = NetMetaDataTableHeader()<EOL>th.reserved_1.value = readDataInstance.readDword()<EOL>th.majorVersion.value = readDataInstance.readByte()<EOL>th.minorVersion.value = readDataInstance.readByte()<EOL>th.heapOffsetSizes.value = readDataInstance.readByte()<EOL>th.reserved_2.value = readDataInstance.readByte()<EOL>th.mas...
Returns a new L{NetMetaDataTableHeader} object. @type readDataInstance: L{ReadData} @param readDataInstance: A L{ReadData} object with data to be parsed as a L{NetMetaDataTableHeader} object. @rtype: L{NetMetaDataTableHeader} @return: A new L{NetMetaDataTableHeader} object.
f11794:c26:m2
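Two fields read above drive the rest of table parsing: the 64-bit valid mask (one bit per possible metadata table) and heapOffsetSizes, whose three low bits select 2- or 4-byte heap indexes. A sketch of decoding both, assuming the standard bit assignments (0x01 #Strings, 0x02 #GUID, 0x04 #Blob):

```python
def present_tables(mask_valid):
    """Indices (0..63) of the metadata tables flagged present in maskValid."""
    return [i for i in range(64) if (mask_valid >> i) & 1]

def heap_index_sizes(heap_offset_sizes):
    """Index width in bytes for the #Strings, #GUID and #Blob heaps,
    taken from the three low bits of heapOffsetSizes."""
    return {
        "strings": 4 if heap_offset_sizes & 0x01 else 2,
        "guid":    4 if heap_offset_sizes & 0x02 else 2,
        "blob":    4 if heap_offset_sizes & 0x04 else 2,
    }
```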
def __init__(self, shouldPack = True):
baseclasses.BaseStructClass.__init__(self, shouldPack)<EOL>self.netMetaDataTableHeader = NetMetaDataTableHeader() <EOL>self.tables = None <EOL>self._attrsList = ["<STR_LIT>", "<STR_LIT>"]<EOL>
NetMetaDataTables object. @todo: Parse every table in this struct and store them in the C{self.tables} attribute.
f11794:c27:m0
def getType(self):
return consts.NET_METADATA_TABLES<EOL>
Returns L{consts.NET_METADATA_TABLES}.
f11794:c27:m1
@staticmethod<EOL><INDENT>def parse(readDataInstance, netMetaDataStreams):<DEDENT>
dt = NetMetaDataTables()<EOL>dt.netMetaDataTableHeader = NetMetaDataTableHeader.parse(readDataInstance)<EOL>dt.tables = {}<EOL>metadataTableDefinitions = dotnet.MetadataTableDefinitions(dt, netMetaDataStreams)<EOL>for i in xrange(<NUM_LIT:64>):<EOL><INDENT>dt.tables[i] = { "<STR_LIT>": <NUM_LIT:0> }<EOL>if dt.netMetaDa...
Returns a new L{NetMetaDataTables} object. @type readDataInstance: L{ReadData} @param readDataInstance: A L{ReadData} object with data to be parsed as a L{NetMetaDataTables} object. @rtype: L{NetMetaDataTables} @return: A new L{NetMetaDataTables} object.
f11794:c27:m2
def __init__(self, shouldPack = True):
baseclasses.BaseStructClass.__init__(self, shouldPack)<EOL>self.signature = datatypes.DWORD(<NUM_LIT:0>)<EOL>self.readerCount = datatypes.DWORD(<NUM_LIT:0>)<EOL>self.readerTypeLength = datatypes.DWORD(<NUM_LIT:0>)<EOL>self.version = datatypes.DWORD(<NUM_LIT:0>)<EOL>self.resourceCount = datatypes.DWORD(<NUM_LIT:0>)<EOL...
NetResources object. @todo: Parse every resource in this struct and store them in the C{self.resources} attribute.
f11794:c28:m0
def getType(self):
return consts.NET_RESOURCES<EOL>
Returns L{consts.NET_RESOURCES}.
f11794:c28:m3
@staticmethod<EOL><INDENT>def parse(readDataInstance):<DEDENT>
r = NetResources()<EOL>r.signature.value = readDataInstance.readDword()<EOL>if r.signature.value != <NUM_LIT>:<EOL><INDENT>return r<EOL><DEDENT>r.readerCount.value = readDataInstance.readDword()<EOL>r.readerTypeLength.value = readDataInstance.readDword()<EOL>r.readerType = utils.ReadData(readDataInstance.read(r.readerTypeLength.value)).readDotNetBl...
Returns a new L{NetResources} object. @type readDataInstance: L{ReadData} @param readDataInstance: A L{ReadData} object with data to be parsed as a L{NetResources} object. @rtype: L{NetResources} @return: A new L{NetResources} object.
f11794:c28:m4
def __init__(self, shouldPack = True):
self.name = datatypes.String("<STR_LIT>")<EOL>self.rva = datatypes.DWORD(<NUM_LIT:0>) <EOL>self.size = datatypes.DWORD(<NUM_LIT:0>) <EOL>self.info = None <EOL>self.shouldPack = shouldPack<EOL>
Class representation of the C{IMAGE_DATA_DIRECTORY} structure. @see: U{http://msdn.microsoft.com/es-es/library/windows/desktop/ms680305%28v=vs.85%29.aspx} @type shouldPack: bool @param shouldPack: If set to C{True} the L{Directory} object will be packed. If set to C{False} the object won't be packed.
f11795:c0:m0
@staticmethod<EOL><INDENT>def parse(readDataInstance):<DEDENT>
d = Directory()<EOL>d.rva.value = readDataInstance.readDword()<EOL>d.size.value = readDataInstance.readDword()<EOL>return d<EOL>
Returns a L{Directory}-like object. @type readDataInstance: L{ReadData} @param readDataInstance: L{ReadData} object to read from. @rtype: L{Directory} @return: L{Directory} object.
f11795:c0:m4
def getType(self):
return consts.IMAGE_DATA_DIRECTORY<EOL>
Returns a value that identifies the L{Directory} object.
f11795:c0:m5
def __init__(self, shouldPack = True):
self.shouldPack = shouldPack<EOL>for i in range(consts.IMAGE_NUMBEROF_DIRECTORY_ENTRIES):<EOL><INDENT>dir = Directory()<EOL>dir.name.value = dirs[i]<EOL>self.append(dir)<EOL><DEDENT>
Array of L{Directory} objects. @type shouldPack: bool @param shouldPack: If set to C{True} the L{DataDirectory} object will be packed. If set to C{False} the object won't be packed.
f11795:c1:m0
@staticmethod<EOL><INDENT>def parse(readDataInstance):<DEDENT>
if len(readDataInstance) == consts.IMAGE_NUMBEROF_DIRECTORY_ENTRIES * <NUM_LIT:8>:<EOL><INDENT>newDataDirectory = DataDirectory()<EOL>for i in range(consts.IMAGE_NUMBEROF_DIRECTORY_ENTRIES):<EOL><INDENT>newDataDirectory[i].name.value = dirs[i]<EOL>newDataDirectory[i].rva.value = readDataInstance.readDword()<EOL>newData...
Returns a L{DataDirectory}-like object. @type readDataInstance: L{ReadData} @param readDataInstance: L{ReadData} object to read from. @rtype: L{DataDirectory} @return: The L{DataDirectory} object containing L{consts.IMAGE_NUMBEROF_DIRECTORY_ENTRIES} L{Directory} objects. @rais...
f11795:c1:m2
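The length check above (IMAGE_NUMBEROF_DIRECTORY_ENTRIES entries of 8 bytes each) and the rva/size reads per entry can be sketched with the stdlib, assuming the usual count of 16:

```python
import struct

NUM_DIRS = 16  # IMAGE_NUMBEROF_DIRECTORY_ENTRIES

def parse_data_directories(data):
    """Parse 16 IMAGE_DATA_DIRECTORY entries (rva DWORD + size DWORD each)."""
    if len(data) != NUM_DIRS * 8:
        raise ValueError("expected %d bytes" % (NUM_DIRS * 8))
    return [dict(zip(("rva", "size"), struct.unpack_from("<II", data, i * 8)))
            for i in range(NUM_DIRS)]
```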
def _generate_configs_from_default(self, overrides=None):<EOL>
config = DEFAULT_CONFIG.copy()<EOL>if not overrides:<EOL><INDENT>overrides = {}<EOL><DEDENT>for k, v in overrides.items():<EOL><INDENT>config[k] = v<EOL><DEDENT>return config<EOL>
Generate configs by inheriting from defaults
f11800:c0:m1
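The loop above is a shallow merge of user overrides onto a copied default dict. A compact equivalent (the DEFAULT_CONFIG keys here are hypothetical, for illustration only):

```python
# Hypothetical defaults; the real DEFAULT_CONFIG keys are not shown in the source.
DEFAULT_CONFIG = {"HEADER_COLUMNS_TO_SKIP": 1, "NAME": 0}

def generate_config(overrides=None):
    """Shallow-merge user overrides on top of the defaults;
    DEFAULT_CONFIG itself is never mutated."""
    return {**DEFAULT_CONFIG, **(overrides or {})}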
def read_ical(self, ical_file_location):
with open(ical_file_location, '<STR_LIT:r>') as ical_file:<EOL><INDENT>data = ical_file.read()<EOL><DEDENT>self.cal = Calendar.from_ical(data)<EOL>return self.cal<EOL>
Read the ical file
f11800:c0:m2
def read_csv(self, csv_location, csv_configs=None):<EOL>
csv_configs = self._generate_configs_from_default(csv_configs)<EOL>with open(csv_location, '<STR_LIT:r>') as csv_file:<EOL><INDENT>csv_reader = csv.reader(csv_file)<EOL>self.csv_data = list(csv_reader)<EOL><DEDENT>self.csv_data = self.csv_data[csv_configs['<STR_LIT>']:]<EOL>return self.csv_data<EOL>
Read the csv file
f11800:c0:m3
def make_ical(self, csv_configs=None):<EOL>
csv_configs = self._generate_configs_from_default(csv_configs)<EOL>self.cal = Calendar()<EOL>for row in self.csv_data:<EOL><INDENT>event = Event()<EOL>event.add('<STR_LIT>', row[csv_configs['<STR_LIT>']])<EOL>event.add('<STR_LIT>', row[csv_configs['<STR_LIT>']])<EOL>event.add('<STR_LIT>', row[csv_configs['<STR_LIT>']])...
Make iCal entries
f11800:c0:m4
def make_csv(self):
for event in self.cal.subcomponents:<EOL><INDENT>if event.name != '<STR_LIT>':<EOL><INDENT>continue<EOL><DEDENT>row = [<EOL>event.get('<STR_LIT>'),<EOL>event.get('<STR_LIT>').dt,<EOL>event.get('<STR_LIT>').dt,<EOL>event.get('<STR_LIT>'),<EOL>event.get('<STR_LIT>'),<EOL>]<EOL>row = [str(x) for x in row]<EOL>self.csv_dat...
Make CSV
f11800:c0:m5
def save_ical(self, ical_location):
data = self.cal.to_ical()<EOL>with open(ical_location, '<STR_LIT:w>') as ical_file:<EOL><INDENT>ical_file.write(data.decode('<STR_LIT:utf-8>'))<EOL><DEDENT>
Save the calendar instance to a file
f11800:c0:m6
def save_csv(self, csv_location):
with open(csv_location, '<STR_LIT:w>') as csv_handle:<EOL><INDENT>writer = csv.writer(csv_handle)<EOL>for row in self.csv_data:<EOL><INDENT>writer.writerow(row)<EOL><DEDENT><DEDENT>
Save the csv to a file
f11800:c0:m7
def diff(x, y, x_only=False, y_only=False):
<EOL>if len(x) == <NUM_LIT:0> and len(y) > <NUM_LIT:0>:<EOL><INDENT>return y <EOL><DEDENT>elif len(y) == <NUM_LIT:0> and len(x) > <NUM_LIT:0>:<EOL><INDENT>return x <EOL><DEDENT>elif len(y) == <NUM_LIT:0> and len(x) == <NUM_LIT:0>:<EOL><INDENT>return []<EOL><DEDENT>if isinstance(x, dict):<EOL><INDENT>x = list(x.items(...
Retrieve a unique of list of elements that do not exist in both x and y. Capable of parsing one-dimensional (flat) and two-dimensional (lists of lists) lists. :param x: list #1 :param y: list #2 :param x_only: Return only unique values from x :param y_only: Return only unique values from y :return: list of unique valu...
f11809:m0
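The docstring above describes a symmetric difference with optional one-sided filtering. A sketch of that behaviour for flat lists (dicts compared by their items, matching the `list(x.items())` conversion above; the two-dimensional case is omitted):

```python
def diff(x, y, x_only=False, y_only=False):
    """Unique elements not shared by x and y; order follows the inputs."""
    if isinstance(x, dict):
        x = list(x.items())
    if isinstance(y, dict):
        y = list(y.items())
    if x_only:
        return [i for i in x if i not in y]
    if y_only:
        return [i for i in y if i not in x]
    # Symmetric difference, preserving input order.
    return [i for i in x if i not in y] + [i for i in y if i not in x]
```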
def differentiate(x, y):
return diff(x, y)<EOL>
Wrapper function for legacy imports of differentiate.
f11809:m1
def get_version(package_name, version_file='<STR_LIT>'):
filename = os.path.join(os.path.dirname(__file__), package_name, version_file)<EOL>with open(filename, '<STR_LIT:rb>') as fp:<EOL><INDENT>return fp.read().decode('<STR_LIT:utf8>').split('<STR_LIT:=>')[<NUM_LIT:1>].strip("<STR_LIT>")<EOL><DEDENT>
Retrieve the package version from a version file in the package root.
f11812:m0
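The body above splits the version file's contents on `=` and strips quotes from the right-hand side. That core step, isolated (the helper name is hypothetical):

```python
def parse_version_line(line):
    """Pull the version out of a line like __version__ = '1.2.3',
    tolerating either quote style."""
    return line.split("=")[1].strip().strip("'\"")
```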
def clean(self):
if self.config.clean:<EOL><INDENT>logger.info('<STR_LIT>')<EOL>self.execute(self.config.clean)<EOL><DEDENT>
Clean the workspace
f11825:c0:m6
def publish(self):
if self.config.publish:<EOL><INDENT>logger.info('<STR_LIT>')<EOL>self.execute(self.config.publish)<EOL><DEDENT>
Publish the current release to PyPI
f11825:c0:m9
def ansi(color, text):
code = COLOR_CODES[color]<EOL>return '<STR_LIT>'.format(code, text, RESET_TERM)<EOL>
Wrap text in an ANSI escape sequence
f11826:m0
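With the literals masked above, the shape of the wrapper is: an SGR escape with the looked-up code, the text, then a reset. A sketch (the COLOR_CODES entries here are an assumed minimal subset):

```python
RESET_TERM = "\033[0m"
COLOR_CODES = {"red": 31, "green": 32, "blue": 34}  # assumed subset

def ansi(color, text):
    """Wrap text in an SGR color escape and reset the terminal afterwards."""
    return "\033[{}m{}{}".format(COLOR_CODES[color], text, RESET_TERM)
```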
def validate(self):
Override this method to implement initial validation
f11827:c0:m1
def check_output(*args, **kwargs):
if hasattr(subprocess, '<STR_LIT>'):<EOL><INDENT>return subprocess.check_output(stderr=subprocess.STDOUT, universal_newlines=True,<EOL>*args, **kwargs)<EOL><DEDENT>else:<EOL><INDENT>process = subprocess.Popen(*args, stdout=subprocess.PIPE, stderr=subprocess.STDOUT,<EOL>universal_newlines=True, **kwargs)<EOL>output, _ =...
Compatibility wrapper for Python 2.6 missing subprocess.check_output
f11829:m0
def execute(self, command):
execute(command, verbose=self.verbose)<EOL>
Execute a command
f11830:c0:m1
def validate(self, dryrun=False):
raise NotImplementedError<EOL>
Ensure the working dir is a repository and there are no modified files
f11830:c0:m2
def commit(self, message):
raise NotImplementedError<EOL>
Commit all modified files
f11830:c0:m3
def tag(self, name, annotation=None):
raise NotImplementedError<EOL>
Create a tag
f11830:c0:m4
def push(self):
raise NotImplementedError<EOL>
Push changes to remote repository
f11830:c0:m5
def color(code):
return lambda t: '<STR_LIT>'.format(code, t)<EOL>
A simple ANSI color wrapper factory
f11832:m0
def header(text):
print('<STR_LIT:U+0020>'.join((blue('<STR_LIT>'), cyan(text))))<EOL>sys.stdout.flush()<EOL>
Display a header
f11832:m1
def info(text, *args, **kwargs):
text = text.format(*args, **kwargs)<EOL>print('<STR_LIT:U+0020>'.join((purple('<STR_LIT>'), text)))<EOL>sys.stdout.flush()<EOL>
Display information
f11832:m2
def success(text):
print('<STR_LIT:U+0020>'.join((green('<STR_LIT>'), white(text))))<EOL>sys.stdout.flush()<EOL>
Display a success message
f11832:m3
def error(text):
print(red('<STR_LIT>'.format(text)))<EOL>sys.stdout.flush()<EOL>
Display an error message
f11832:m4
@task<EOL>def clean(ctx):
header(clean.__doc__)<EOL>with ctx.cd(ROOT):<EOL><INDENT>for pattern in CLEAN_PATTERNS:<EOL><INDENT>info(pattern)<EOL><DEDENT>ctx.run('<STR_LIT>'.format('<STR_LIT:U+0020>'.join(CLEAN_PATTERNS)))<EOL><DEDENT>
Cleanup all build artifacts
f11832:m6
@task<EOL>def deps(ctx):
header(deps.__doc__)<EOL>with ctx.cd(ROOT):<EOL><INDENT>ctx.run('<STR_LIT>', pty=True)<EOL><DEDENT>
Install or update development dependencies
f11832:m7
@task<EOL>def cover(ctx, report=False, verbose=False):
header(cover.__doc__)<EOL>cmd = [<EOL>'<STR_LIT>',<EOL>'<STR_LIT>',<EOL>'<STR_LIT>',<EOL>'<STR_LIT>',<EOL>]<EOL>if verbose:<EOL><INDENT>cmd.append('<STR_LIT>')<EOL><DEDENT>if report:<EOL><INDENT>cmd += [<EOL>'<STR_LIT>'.format(ROOT),<EOL>'<STR_LIT>'.format(ROOT),<EOL>'<STR_LIT>'<EOL>]<EOL><DEDENT>with ctx.cd(ROOT):<EOL...
Run the test suite with coverage
f11832:m9
@task<EOL>def qa(ctx):
header(qa.__doc__)<EOL>with ctx.cd(ROOT):<EOL><INDENT>info('<STR_LIT>')<EOL>flake8_results = ctx.run('<STR_LIT>', pty=True, warn=True)<EOL>if flake8_results.failed:<EOL><INDENT>error('<STR_LIT>')<EOL><DEDENT>else:<EOL><INDENT>success('<STR_LIT>')<EOL><DEDENT>info('<STR_LIT>')<EOL>readme_results = ctx.run('<STR_LIT>', p...
Run a quality report
f11832:m10
@task<EOL>def tox(ctx):
header(tox.__doc__)<EOL>ctx.run('<STR_LIT>', pty=True)<EOL>
Run tests in all Python versions
f11832:m11
@task<EOL>def doc(ctx):
header(doc.__doc__)<EOL>with ctx.cd(os.path.join(ROOT, '<STR_LIT>')):<EOL><INDENT>ctx.run('<STR_LIT>', pty=True)<EOL><DEDENT>success('<STR_LIT>')<EOL>
Build the documentation
f11832:m12
@task<EOL>def completion(ctx):
header(completion.__doc__)<EOL>with ctx.cd(ROOT):<EOL><INDENT>ctx.run('<STR_LIT>', pty=True)<EOL><DEDENT>success('<STR_LIT>')<EOL>
Generate bash completion script
f11832:m13
@task<EOL>def dist(ctx):
header(dist.__doc__)<EOL>with ctx.cd(ROOT):<EOL><INDENT>ctx.run('<STR_LIT>', pty=True)<EOL><DEDENT>success('<STR_LIT>')<EOL>
Package for distribution
f11832:m14
def rst(filename):
return io.open(filename).read()<EOL>
Load rst file and sanitize it for PyPI. Remove unsupported GitHub tags: - code-block directive - Travis CI build badge
f11833:m0
def pip(name):
with io.open(os.path.join('<STR_LIT>', '<STR_LIT>'.format(name))) as f:<EOL><INDENT>return f.readlines()<EOL><DEDENT>
Parse requirements file
f11833:m1
def add_asset(self, asset='<STR_LIT>', amount=<NUM_LIT:0>, timestamp=None):
if timestamp is None:<EOL><INDENT>timestamp = datetime.utcnow()<EOL><DEDENT>if amount < <NUM_LIT:0>:<EOL><INDENT>raise ValueError('<STR_LIT>'<EOL>'<STR_LIT>'.format(amount))<EOL><DEDENT>if asset not in self.assets:<EOL><INDENT>self.assets[asset] = amount<EOL><DEDENT>else:<EOL><INDENT>self.assets[asset] += amount<EOL><DEDENT>self.history.append({<EOL>'<STR_LIT>': str(timestamp),<EOL>'<STR_LIT>'...
Adds the given amount of an asset to this portfolio. :param asset: the asset to add to the portfolio :param amount: the amount of the asset to add :param timestamp: datetime obj indicating the time the asset was added
f11838:c0:m1
def get_value(self, timestamp=None, asset=None):
if timestamp is None:<EOL><INDENT>timestamp = datetime.utcnow()<EOL><DEDENT>value = <NUM_LIT:0><EOL>backdated_assets = self.assets.copy()<EOL>for trade in list(reversed(self.history)):<EOL><INDENT>if dateutil.parser.parse(trade['<STR_LIT>']) > timestamp:<EOL><INDENT>backdated_assets[trade['<STR_LIT>']] -= trade['<STR_LIT>']<EOL>if backdated_assets[trade['<STR_LIT>']] == <NUM_LIT:0>:<EOL><INDEN...
Get the value of the portfolio at a given time. :param asset: gets the value of a given asset in this portfolio if specified; if None, returns the portfolio's value :param timestamp: a datetime obj to check the portfolio's value at :returns: the value of the portfolio
f11838:c0:m2
def get_historical_value(<EOL>self,<EOL>start,<EOL>end=None,<EOL>freq='<STR_LIT:D>',<EOL>date_format='<STR_LIT>',<EOL>chart=False,<EOL>filename='<STR_LIT>'<EOL>):
if end is None:<EOL><INDENT>end = datetime.utcnow()<EOL><DEDENT>date_range = pd.date_range(start, end, freq=freq)<EOL>to_remove = []<EOL>while len(date_range) > <NUM_LIT>:<EOL><INDENT>for index, date in enumerate(date_range):<EOL><INDENT>if index % <NUM_LIT:2> == <NUM_LIT:0> and index != <NUM_LIT:0>:<EOL><INDENT>to_remove.append(date)<EOL><DEDENT><DEDENT>date_range = date_range.dro...
Display a chart of this portfolios value during the specified timeframe. :param start: datetime obj left bound of the time interval :param end: datetime obj right bound of the time interval :param freq: a time frequency within the interval :param date_format: the format of the date/x-axis labels :param chart: whether ...
f11838:c0:m3
def remove_asset(self, asset='<STR_LIT>', amount=<NUM_LIT:0>, timestamp=None):
if timestamp is None:<EOL><INDENT>timestamp = datetime.utcnow()<EOL><DEDENT>if amount < <NUM_LIT:0>:<EOL><INDENT>raise ValueError('<STR_LIT>'<EOL>'<STR_LIT>'.format(amount))<EOL><DEDENT>if self.get_value(timestamp, asset) < amount:<EOL><INDENT>raise ValueError('<STR_LIT>'<EOL>'<STR_LIT>'.format(amount, self.assets[asset]))<EOL><DEDENT>self.assets[asset] -= amount<EOL>self.history.append({<EOL>...
Removes the given amount of an asset from this portfolio. :param asset: the asset to remove from the portfolio :param amount: the amount of the asset to remove :param timestamp: datetime obj indicating the time the asset was removed
f11838:c0:m4
def trade_asset(<EOL>self,<EOL>amount,<EOL>from_asset,<EOL>to_asset,<EOL>timestamp=None<EOL>):
if timestamp is None:<EOL><INDENT>timestamp = datetime.utcnow()<EOL><DEDENT>if to_asset == '<STR_LIT>':<EOL><INDENT>price = <NUM_LIT:1>/self.manager.get_price(from_asset, timestamp.date())<EOL><DEDENT>else:<EOL><INDENT>price = self.manager.get_price(to_asset, timestamp.date())<EOL><DEDENT>self.remove_asset(from_asset, amount, timestamp)<EOL>self.add_asset(to_asset, amount * <NUM_LIT:1>/price, ...
Exchanges one asset for another. If it's a backdated trade, the historical exchange rate is used. :param amount: the amount of the asset to trade :param from_asset: the asset you are selling :param to_asset: the asset you are buying :param timestamp: datetime obj indicating the time the asset was traded
f11838:c0:m5
def retrieve_data(self):
<EOL>df = self.manager.get_historic_data(self.start.date(), self.end.date())<EOL>df.replace(<NUM_LIT:0>, np.nan, inplace=True)<EOL>return df<EOL>
Retrieves data as a DataFrame.
f11839:c0:m1
def get_min_risk(self, weights, cov_matrix):
def func(weights):<EOL><INDENT>"""<STR_LIT>"""<EOL>return np.matmul(np.matmul(weights.transpose(), cov_matrix), weights)<EOL><DEDENT>def func_deriv(weights):<EOL><INDENT>"""<STR_LIT>"""<EOL>return (<EOL>np.matmul(weights.transpose(), cov_matrix.transpose()) +<EOL>np.matmul(weights.transpose(), cov_matrix)<EOL>)<EOL><DE...
Minimizes the variance of a portfolio.
f11839:c0:m2
def get_max_return(self, weights, returns):
def func(weights):<EOL><INDENT>"""<STR_LIT>"""<EOL>return np.dot(weights, returns.values) * -<NUM_LIT:1><EOL><DEDENT>constraints = ({'<STR_LIT:type>': '<STR_LIT>', '<STR_LIT>': lambda weights: (weights.sum() - <NUM_LIT:1>)})<EOL>solution = self.solve_minimize(func, weights, constraints)<EOL>max_return = solution.fun * ...
Maximizes the returns of a portfolio.
f11839:c0:m3
def efficient_frontier(<EOL>self,<EOL>returns,<EOL>cov_matrix,<EOL>min_return,<EOL>max_return,<EOL>count<EOL>):
columns = [coin for coin in self.SUPPORTED_COINS]<EOL>values = pd.DataFrame(columns=columns)<EOL>weights = [<NUM_LIT:1>/len(self.SUPPORTED_COINS)] * len(self.SUPPORTED_COINS)<EOL>def func(weights):<EOL><INDENT>"""<STR_LIT>"""<EOL>return np.matmul(np.matmul(weights.transpose(), cov_matrix), weights)<EOL><DEDENT>def func...
Returns a DataFrame of efficient portfolio allocations for `count` risk indices.
f11839:c0:m4
def solve_minimize(<EOL>self,<EOL>func,<EOL>weights,<EOL>constraints,<EOL>lower_bound=<NUM_LIT:0.0>,<EOL>upper_bound=<NUM_LIT:1.0>,<EOL>func_deriv=False<EOL>):
bounds = ((lower_bound, upper_bound), ) * len(self.SUPPORTED_COINS)<EOL>return minimize(<EOL>fun=func, x0=weights, jac=func_deriv, bounds=bounds,<EOL>constraints=constraints, method='<STR_LIT>', options={'<STR_LIT>': False}<EOL>)<EOL>
Returns the solution to a minimization problem.
f11839:c0:m5
def allocate(self):
df = self.manager.get_historic_data()[self.SUPPORTED_COINS]<EOL>change_columns = []<EOL>for column in df:<EOL><INDENT>if column in self.SUPPORTED_COINS:<EOL><INDENT>change_column = '<STR_LIT>'.format(column)<EOL>values = pd.Series(<EOL>(df[column].shift(-<NUM_LIT:1>) - df[column]) /<EOL>-df[column].shift(-<NUM_LIT:1>)<...
Returns an efficient portfolio allocation for the given risk index.
f11839:c0:m6
@property<EOL><INDENT>def base_point(self):<DEDENT>
return JacobianPoint(self, self.Gx, self.Gy)<EOL>
Returns the base point for this curve. Returns: JacobianPoint: The base point.
f11841:c0:m1
def inverse(self, N):
if N == <NUM_LIT:0>:<EOL><INDENT>return <NUM_LIT:0><EOL><DEDENT>lm, hm = <NUM_LIT:1>, <NUM_LIT:0><EOL>low, high = N % self.P, self.P<EOL>while low > <NUM_LIT:1>:<EOL><INDENT>r = high//low<EOL>nm, new = hm - lm * r, high - low * r<EOL>lm, low, hm, high = nm, new, lm, low<EOL><DEDENT>return lm % self.P<EOL>
Returns the modular inverse of an integer with respect to the field characteristic, P. Uses the Extended Euclidean Algorithm: https://en.wikipedia.org/wiki/Extended_Euclidean_algorithm
f11841:c0:m2
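The same extended-Euclidean loop as a standalone function (on Python 3.8+, `pow(n, -1, p)` computes the same inverse directly):

```python
def mod_inverse(n, p):
    """Extended Euclidean algorithm: find x with (n * x) % p == 1."""
    if n == 0:
        return 0
    lm, hm = 1, 0
    low, high = n % p, p
    while low > 1:
        r = high // low
        lm, low, hm, high = hm - lm * r, high - low * r, lm, low
    return lm % p

P = 2**256 - 2**32 - 977            # secp256k1 field characteristic
assert (12345 * mod_inverse(12345, P)) % P == 1
assert mod_inverse(3, 7) == 5       # 3 * 5 = 15 = 1 (mod 7)
```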
def is_on_curve(self, point):
X, Y = point.X, point.Y<EOL>return (<EOL>pow(Y, <NUM_LIT:2>, self.P) - pow(X, <NUM_LIT:3>, self.P) - self.a * X - self.b<EOL>) % self.P == <NUM_LIT:0><EOL>
Checks whether a point is on the curve. Args: point (AffinePoint): Point to be checked. Returns: bool: True if point is on the curve, False otherwise.
f11841:c0:m3
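A sketch of the same check instantiated for secp256k1 (y² = x³ + 7 mod P), using the standard curve constants:

```python
P = 2**256 - 2**32 - 977   # secp256k1 prime
A, B = 0, 7                # curve: y^2 = x^3 + 7 (mod P)
Gx = 0x79BE667EF9DCBBAC55A06295CE870B07029BFCDB2DCE28D959F2815B16F81798
Gy = 0x483ADA7726A3C4655DA4FBFC0E1108A8FD17B448A68554199C47D08FFB10D4B8

def on_curve(x, y):
    """True when (x, y) satisfies y^2 - x^3 - a*x - b == 0 (mod P)."""
    return (pow(y, 2, P) - pow(x, 3, P) - A * x - B) % P == 0

assert on_curve(Gx, Gy)            # the generator point lies on the curve
assert not on_curve(Gx, Gy + 1)    # a perturbed point does not
```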
def generate_private_key(self):
random_string = base64.b64encode(os.urandom(<NUM_LIT>)).decode('<STR_LIT:utf-8>')<EOL>binary_data = bytes(random_string, '<STR_LIT:utf-8>')<EOL>hash_object = hashlib.sha256(binary_data)<EOL>message_digest_bin = hash_object.digest()<EOL>message_digest_hex = binascii.hexlify(message_digest_bin)<EOL>return message_digest_...
Generates a private key from random data. SHA-256 is a member of the SHA-2 cryptographic hash functions designed by the NSA. SHA stands for Secure Hash Algorithm. Random bytes from the OS are base64-encoded, converted back to bytes, and hashed with SHA-256. The binary output is converted to a hex representation. Args: data (str): The data to ...
f11841:c1:m1
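A sketch of the same pipeline with standard-library calls only. Note that hashing `os.urandom` output adds no entropy, so the SHA-256 step mainly normalizes the key to 32 bytes:

```python
import base64
import binascii
import hashlib
import os

def random_private_key():
    """256 bits of OS randomness, hashed with SHA-256, as a hex string."""
    seed = base64.b64encode(os.urandom(64)).decode('utf-8')
    digest = hashlib.sha256(seed.encode('utf-8')).digest()
    return binascii.hexlify(digest).decode('utf-8')

key = random_private_key()
assert len(key) == 64   # 32-byte digest -> 64 hex characters
```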
def generate_public_key(self):
private_key = int(self.private_key, <NUM_LIT:16>)<EOL>if private_key >= self.N:<EOL><INDENT>raise Exception('<STR_LIT>')<EOL><DEDENT>G = JacobianPoint(self.Gx, self.Gy, <NUM_LIT:1>)<EOL>public_key = G * private_key<EOL>x_hex = '<STR_LIT>'.format(public_key.X, <NUM_LIT:64>)<EOL>y_hex = '<STR_LIT>'.format(public_key.Y, <...
Generates a public key from the hex-encoded private key using elliptic curve cryptography. The private key is multiplied by a predetermined point on the elliptic curve called the generator point, G, resulting in the corresponding public key. The generator point is always the same for all Bitcoin users. Jacobian coord...
f11841:c1:m2
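The multiplication `G * private_key` can be sketched in affine coordinates with plain double-and-add (slower than the Jacobian form the class uses, but easy to verify); curve constants are the standard secp256k1 values:

```python
P = 2**256 - 2**32 - 977
Gx = 0x79BE667EF9DCBBAC55A06295CE870B07029BFCDB2DCE28D959F2815B16F81798
Gy = 0x483ADA7726A3C4655DA4FBFC0E1108A8FD17B448A68554199C47D08FFB10D4B8

def inv(n):
    return pow(n, P - 2, P)          # Fermat inverse; P is prime

def add(p1, p2):
    """Affine point addition/doubling on y^2 = x^3 + 7 (mod P)."""
    if p1 is None: return p2         # None stands for the point at infinity
    if p2 is None: return p1
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2 and (y1 + y2) % P == 0:
        return None                  # P + (-P) = point at infinity
    if p1 == p2:
        s = 3 * x1 * x1 * inv(2 * y1) % P
    else:
        s = (y2 - y1) * inv(x2 - x1) % P
    x3 = (s * s - x1 - x2) % P
    return (x3, (s * (x1 - x3) - y1) % P)

def scalar_mult(k, point):
    """Double-and-add: compute k * point."""
    result = None
    while k:
        if k & 1:
            result = add(result, point)
        point = add(point, point)
        k >>= 1
    return result

pub = scalar_mult(0xC0FFEE, (Gx, Gy))
x, y = pub
assert (y * y - x ** 3 - 7) % P == 0   # the result is still on the curve
```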
def generate_address(self):
binary_pubkey = binascii.unhexlify(self.public_key)<EOL>binary_digest_sha256 = hashlib.sha256(binary_pubkey).digest()<EOL>binary_digest_ripemd160 = hashlib.new('<STR_LIT>', binary_digest_sha256).digest()<EOL>binary_version_byte = bytes([<NUM_LIT:0>])<EOL>binary_with_version_key = binary_version_byte + binary_digest_rip...
Creates a Bitcoin address from the public key. Details of the steps for creating the address are outlined in this link: https://en.bitcoin.it/wiki/Technical_background_of_version_1_Bitcoin_addresses The last step is Base58Check encoding, which is similar to Base64 encoding but slightly different to create a more huma...
f11841:c1:m3
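The final Base58Check step can be sketched with SHA-256 only (`hashlib.new('ripemd160')`, used earlier in the method, is not available in every OpenSSL build, so the hash-160 step is omitted here). The Base58 alphabet drops 0, O, I, and l to avoid look-alike characters:

```python
import hashlib

ALPHABET = '123456789ABCDEFGHJKLMNPQRSTUVWXYZabcdefghijkmnopqrstuvwxyz'

def base58check(payload):
    """Append a 4-byte double-SHA256 checksum, then Base58-encode."""
    checksum = hashlib.sha256(hashlib.sha256(payload).digest()).digest()[:4]
    data = payload + checksum
    n = int.from_bytes(data, 'big')
    out = ''
    while n:
        n, rem = divmod(n, 58)
        out = ALPHABET[rem] + out
    # each leading zero byte is encoded as a leading '1'
    pad = len(data) - len(data.lstrip(b'\x00'))
    return '1' * pad + out

addr = base58check(bytes(21))        # version byte 0x00 + 20 zero bytes
assert addr.startswith('1')          # mainnet addresses start with '1'
assert all(c in ALPHABET for c in addr)
```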
def double(self):
X1, Y1, Z1 = self.X, self.Y, self.Z<EOL>if Y1 == <NUM_LIT:0>:<EOL><INDENT>return POINT_AT_INFINITY<EOL><DEDENT>S = (<NUM_LIT:4> * X1 * Y1 ** <NUM_LIT:2>) % self.P<EOL>M = (<NUM_LIT:3> * X1 ** <NUM_LIT:2> + self.a * Z1 ** <NUM_LIT:4>) % self.P<EOL>X3 = (M ** <NUM_LIT:2> - <NUM_LIT:2> * S) % self.P<EOL>Y3 = (M * (S - X3)...
Doubles this point. Returns: JacobianPoint: The point corresponding to `2 * self`.
f11841:c2:m5
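The S/M doubling formulas above can be cross-checked against textbook affine doubling; a sketch for secp256k1 (a = 0), converting back to affine via x = X/Z², y = Y/Z³:

```python
P = 2**256 - 2**32 - 977
A = 0
Gx = 0x79BE667EF9DCBBAC55A06295CE870B07029BFCDB2DCE28D959F2815B16F81798
Gy = 0x483ADA7726A3C4655DA4FBFC0E1108A8FD17B448A68554199C47D08FFB10D4B8

def jacobian_double(X1, Y1, Z1):
    """S/M doubling formulas in Jacobian coordinates (x = X/Z^2, y = Y/Z^3)."""
    S = (4 * X1 * Y1 ** 2) % P
    M = (3 * X1 ** 2 + A * Z1 ** 4) % P
    X3 = (M ** 2 - 2 * S) % P
    Y3 = (M * (S - X3) - 8 * Y1 ** 4) % P
    Z3 = (2 * Y1 * Z1) % P
    return X3, Y3, Z3

# Convert back to affine and compare with textbook affine doubling.
X3, Y3, Z3 = jacobian_double(Gx, Gy, 1)
z_inv = pow(Z3, P - 2, P)
x_j, y_j = X3 * z_inv ** 2 % P, Y3 * z_inv ** 3 % P

s = 3 * Gx ** 2 * pow(2 * Gy, P - 2, P) % P   # slope of the tangent at G
x_a = (s * s - 2 * Gx) % P
y_a = (s * (Gx - x_a) - Gy) % P
assert (x_j, y_j) == (x_a, y_a)               # both routes agree on 2G
```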