Program Coordinator, FDA and Third Party Tobacco Retail Inspection Program - JBS International, Inc. - Maryland   
Experience in conducting data analysis, preferably using SQL Server databases. A Bachelor's degree in business administration, public administration, criminal...
From JBS International, Inc. - Fri, 02 Jun 2017 03:23:28 GMT - View all Maryland jobs
          Q2 2017 Update of World Development Indicators Available   

The World Development Indicators database has been updated. This is a regular quarterly update to over 600 indicators and includes both new indicators and updates to existing indicators.

2016 data for population, GDP and GNI-related indicators have been released for countries and aggregates. Other updated data include balance of payments series, monetary indicators, military expenditure, and merchandise trade. The classifications of countries by income, and the aggregations by income group, reflect the new fiscal year 2018 income classifications.

New Public-Private Partnership series have been introduced in this release. The percentage of people with an account (SDG 8.10.2, from the Findex) is also available, disaggregated by sex, income, and education level.

Purchasing Power Parities have been updated for OECD and Eurostat countries to reflect their latest release. Purchasing Power Parities and related indicators in PPP terms for Cuba (expenditures, income, etc.) have been removed.

Data can be accessed via various means including:

- The World Bank’s multi-lingual, mobile-friendly data site, http://data.worldbank.org  
- The DataBank query tool: http://databank.worldbank.org 
- Bulk download in XLS and CSV formats, or directly from the API (see the sketch below)
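
For example, here is a minimal sketch of pulling one indicator value through the API. The endpoint layout and the SP.POP.TOTL (total population) indicator code are taken from the public v2 API rather than from this announcement, so treat them as assumptions:

// Fetch Brazil's 2016 total population from the World Bank API (v2).
$url = 'http://api.worldbank.org/v2/country/BR/indicator/SP.POP.TOTL'
     . '?format=json&date=2016';
// The decoded response is a two-element array: paging metadata first,
// then the data rows.
$rows = json_decode(file_get_contents($url), TRUE)[1];
foreach ($rows as $row) {
  printf("%s (%s): %s\n", $row['country']['value'], $row['date'], $row['value']);
}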


          Senior SQL Integration/ETL Consultant (CTH)   
PA-Collegeville, AETEA Information Technology has an immediate need for a Senior Database Consultant with expertise in SQL Server. This is a contract-to-hire role, and the client is only considering candidates who can work on a W2 basis. LOCATION: Remote. Accepting Candidates. Aetea has been asked by their valued client, a leading healthcare company, to assist with their need for a Sr. ETL Developer. The client and
          Senior Oracle Database Administrator - Wesco Aircraft - Austin, TX   
7+ years' experience using UNIX and/or Linux Red Hat in an Oracle environment; 5+ years' experience in Oracle RAC, ASM, and Physical Standby/Active Data Guard...
From Wesco Aircraft - Mon, 12 Jun 2017 20:10:00 GMT - View all Austin, TX jobs
          Drupal core: Sort is lost when using views exposed filter   

Problem/Motivation

Original problem report, as amended by @mpp:

  • Create a paged view with an exposed block and expose the "items per page" option to allow a user to change the amount of results.
  • When applying a sort on the view and then changing the amount of items per page, the sort is lost.

The original report is from 30 Oct 2016. There were a couple of attempts to recreate the problem, and it is unclear whether anyone was able to recreate the same problem. Comment #2823541-10: Sort is lost when using views exposed filter recreates a problem, but not necessarily the same issue.

A new issue was created #2887144: Views exposed form block options are not updated immediately when adding additional sorts, filters, etc (Caching?).

On June 25, 2017 we tried to recreate the problem on Drupal 8.2.1 and 8.2.6 (to see if we could recreate it on the version likely used by the original reporter). We were unable to recreate the problem using the steps outlined in these comments:

#2823541-9: Sort is lost when using views exposed filter
#2823541-10: Sort is lost when using views exposed filter

Proposed resolution

At the moment, the views exposed form filters out GET parameters in ViewsExposedForm::buildForm:

$form['#action'] = $view->hasUrl() ? $view->getUrl()->toString() : Url::fromRoute('<current>')->toString();

A quick workaround would be to alter the exposed block form to add the current sort and order query parameters, as sketched below.
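
A minimal sketch of that workaround as a form alter hook ("mymodule" is a placeholder module name; sort_by and sort_order are the default Views exposed-sort query keys):

use Drupal\Core\Form\FormStateInterface;

/**
 * Implements hook_form_FORM_ID_alter() for views_exposed_form.
 */
function mymodule_form_views_exposed_form_alter(&$form, FormStateInterface $form_state, $form_id) {
  $query = \Drupal::request()->query;
  foreach (['sort_by', 'sort_order'] as $key) {
    if ($query->has($key)) {
      // Re-submit the current sort along with the exposed values so that
      // changing "items per page" no longer drops it.
      $form[$key] = ['#type' => 'hidden', '#value' => $query->get($key)];
    }
  }
}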

Remaining tasks

User interface changes

(New or changed features/functionality in the user interface, modules added or removed, changes to URL paths, changes to user interface text.)

API changes

(API changes/additions that would affect module, install profile, and theme developers, including examples of before/after code if appropriate.)

Data model changes

(Database or configuration data changes that would make stored data on an existing site incompatible with the site's updated codebase, including changes to hook_schema(), configuration schema or keys, or the expected format of stored data, etc.)

Original report by [username]

(Text of the original report, for legacy issues whose initial post was not the issue summary. Use rarely.)


          UC Wish List: Drupal 8 Roadmap   

Goals and Implementation overview

Listed below are the goals and deliverables for the Google Summer of Code 2017 project Porting UC wishlist to Drupal 8:

Phase 1:

  1. Add administrator wish list settings:
    • The feature makes use of the Form API for the creation and implementation of the above functionalities. The D7 form functions have to be refactored to use FormStateInterface.
    • The Configuration API provides a central location to store the configuration of the above components, in particular \Drupal::config()->get() and ->set() on an editable configuration object, to store the variables involved (see the sketch after this list).
    • The global theme variables have to be replaced by the active theme object.
  2. Enable 'Add to Cart' and 'Add to Wish List' buttons:
    • Enables the user to add items (products) either to the cart or to a particular wish list, as required. This feature also enables adding items held in specific wish lists to the cart.
    • This feature would be implemented using hook functions in Drupal. For adding items to the cart, the function uc_cart_add_item() would be used. The button would be accessible on the user's wish list only if the product is not out of stock; to check that, we would use the uc_stock module's uc_product_exist.
    • For adding items to a wish list, we would use the hook function uc_wishlist_add_to_wishlist_submit; delivery to orders will add the item to the wish list and redirect the page.
    • Implement hook_form_validation to add validation to uc_cart_view_form.
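
A minimal sketch of the Configuration API usage described in item 1 above (the config object name uc_wishlist.settings and the default_title key are assumptions for illustration):

// Reading: \Drupal::config() returns an immutable configuration object.
$default_title = \Drupal::config('uc_wishlist.settings')->get('default_title');

// Writing goes through the editable copy from the config factory.
\Drupal::configFactory()
  ->getEditable('uc_wishlist.settings')
  ->set('default_title', $default_title)
  ->save();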

Phase 2:

  1. Allowing a user to view/update a wish list:
    • This functionality enables the user to view the contents of a specific wish list, modify it according to the options available, and add the required items to the cart.
    • The callback uc_wishlist_user_display is implemented for user authorization.
    • The callback uc_wishlist_display($wid, $mode) is utilised in the function uc_wishlist_view_form to check for the required permissions and private checkboxes.
    • The Theme API is used to arrange the created forms on the resulting page. The renderable array system has to be applied to improve the grouping of items.
  2. 'Option to email wish list to others' feature:
    • This feature would enable users or wish list owners to share any wish list with other users or potential customers by emailing it to the respective user.
    • The email subject, recipients and message properties are defined in an appropriate hook_mail implementation (see the sketch after this list).
    • The Form API and Database API would be used for the creation, submission and validation of the wish list email, and to provide a structured interface for the dynamic construction of the queries implementing the above functionality.
    • Conversion of D7 hook functions to D8 APIs involves routes, which define the path to controllers (page callbacks in D7) or create tabs and/or contextual links. The email address of a user is verified through the Data Common API.
    • The renderable array system has to be applied to improve the grouping of form items. The email could be sent via the mail manager's public function MailManager::mail().
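
A minimal sketch of the email feature (the share_wishlist mail key, the parameter names, and the $recipient_email, $wishlist_title and $message_text variables are placeholders for illustration):

/**
 * Implements hook_mail().
 */
function uc_wishlist_mail($key, &$message, $params) {
  if ($key === 'share_wishlist') {
    $message['subject'] = t('@title has been shared with you', ['@title' => $params['title']]);
    $message['body'][] = $params['message'];
  }
}

// Sending through the mail manager service, as mentioned above.
\Drupal::service('plugin.manager.mail')->mail(
  'uc_wishlist',
  'share_wishlist',
  $recipient_email,
  \Drupal::currentUser()->getPreferredLangcode(),
  ['title' => $wishlist_title, 'message' => $message_text]
);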

Phase 3:

  1. Enable 'Search Wish List' functionality:
    • Allows the user to search for a specific wish list in order to access or modify it.
    • This functionality would make use of the Form and Search APIs for executing a functional and effective search.
  2. Add user wish list settings:
    • This functionality specifically allows the user to implement certain modifications for any specific wish list.
    • This feature is implemented by creating a table for the required wish list settings using the Form and Database APIs.
    • The mandatory fields, including User, Title, Expiration Date and Status, are created using the Field API and the hook functions hook_field_settings_form and hook_field_widget_form. The Configuration API would be used to configure the fields.
    • db_select is implemented with a join to combine with another table containing relevant information (see the sketch after this list).
    • The renderable array system would be applied to improve the grouping of the contents. The field theme system would be configured.
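
A minimal sketch of the join described in the last item (the uc_wishlists and uc_wishlist_products table and column names are assumptions based on the D7 module's schema; $uid is a placeholder):

// Join the wish list table to its products table and fetch one user's rows.
$query = \Drupal::database()->select('uc_wishlists', 'w');
$query->join('uc_wishlist_products', 'p', 'p.wid = w.wid');
$rows = $query
  ->fields('w', ['wid', 'title', 'expiration'])
  ->fields('p', ['nid', 'qty'])
  ->condition('w.uid', $uid)
  ->execute()
  ->fetchAll();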

    Documentation

    References:


              Favalias 0.9.2 available !   

    Summary of changes in version 0.9.2:
    - You can save/load your favorites and aliases on my web server (jcmag.europe.webmatrixhosting.net, in a SQL Server database), so you can retrieve them anywhere.
    I put the default files on the server; you can retrieve them using the username/password favalias/favalias (warning: it will overwrite your current settings!)
    - You can type a command even if it is not an alias, for example "iisreset", so Favalias behaves like the Windows "Run" window.
    - Changed the "cmd" alias to: "run cmd.exe /K date /t & time /t & if exist "%VS71COMNTOOLS%\vsvars32.bat" ("%VS71COMNTOOLS%\vsvars32.bat") else (echo VS.NET not installed) &", so you can use the .NET tools if VS is installed, and you can create aliases to launch commands, such as
    an "ip" alias to display your IP addresses -> "cmd ipconfig /all"


              Link Building Secrets   

    The ultimate goal for your website is to reach the top position in Google. Utilizing link building is a basic but highly effective method of accomplishing that goal. These are some of the best tools and methods to use to increase your site's popularity with the search engines, which leads to more visitors who then become buyers.



    Learn to write creative and useful articles for your niche that add value for the reader. Posting your articles to article directories can lead to many back links to your site. As webmasters use your articles on their sites, the potential for even more traffic to your site grows rapidly. Searching for 'article marketing' will yield many sites and software to help produce and submit your articles.



    Building a blog for your niche and linking it back to your main page is a great way to build links and increase traffic to your site. Use some of the articles you write as posts on your blog. By linking back to your main site, you’ll get inbound links. Do the same type of link building by creating Squidoo lenses and HubPages.



    Bookmarks are great link building tools. When you bookmark your articles, they are linked back to your site. Popular bookmark sites include digg.com, del.icio.us, and technorati.com. The Social Bookmarking Tool has approximately 50 such bookmarking sites to help create more inbound links.



    An easy method to obtain natural quality links is to submit your site to web directories. By listing your site with a large number of directories, you will receive a higher number of inbound links. This can be tedious by hand, so search for 'directory submission software' or 'directory submission tool.' These tools will let you submit your site to many niche-related directories in their database at once, automatically, which really saves time. Many online companies provide fast and free directory submission software to aid in your link building.



    Posting to forums is a wonderful way to get backlinks to your site. Find forums throughout the web that are related to your site or business and join the discussion. Always include a signature with your site link at the end of every post and you are on your way to building link popularity.



    These are five of the most effective methods for building links back to your site. Using one or a combination of these tools is a great way to increase your site’s popularity. Combine these tools with a little perseverance and patience and you’re on your way to a higher search engine ranking through link building.

     


    Building links can be tedious work by hand. Get backlinks the smart way - use automation!


              SUPERVISOR, MEDICAL CODING - Business Services Admin - Full Time - Day   
    Posted on: 2017-07-01

    Performs a variety of medical specialty-based charge capture and/or coding functions (entering charges, reviewing and resolving coding edits, adding/removing modifiers, validating medical necessity, appropriately selecting the correct codes (ICD9/ICD10, CPT/HCPCS and modifiers), ensuring that clinical documentation is complete and supports medical billing, and performing partial/full record abstraction of charges and other medical coding or charge capture related duties). Reviews, processes and posts transaction data from patient accounts. Gathers, classifies, tabulates and proofreads financial data. Performs arithmetic calculations. Scans and electronically files documents. Checks items on reports, summarizing and posting the data to designated accounts, and performs a variety of other fiscal office duties. Performs regular coding monitoring and educational reviews for providers and the coding team to support meeting billing requirements for governmental and commercial payers. Provides timely and accurate updates to key stakeholders on coding updates and performance to maintain and/or improve reimbursement. Responsible for assisting with charge reconciliation and charge corrections. Also responsible for reviewing providers who are on continued coding holds (aka Concurrent Coding Reviews). Responsible for assisting with the communication of annual coding updates to providers. Responsible for evaluating patient-related coding disputes. Oversees and performs quality audits to assure consistency, accuracy and standardization of procedures and to optimize medical coding. Ensures compliance with policies/procedures and coding standards. Actively involved in improvement efforts, workflow design and validation, with input on policies and procedures. Supervises departmental resources effectively. Performs quality audits, providing retraining or action plans as needed to improve accuracy and meet production/patient satisfaction targets. Generates and reviews reports to track performance outcomes, and performs root cause analysis to identify performance improvement needs. Works with leadership on process improvement, tracking all efforts and outcomes. Monitors reports to identify additional process improvement opportunities. Assigns work based on staffing levels and workload to optimize productivity and meet department standards. Communicates with customers, including clinical and non-clinical staff, physicians and leadership, to manage department operations. Provides timely feedback to staff on job performance, with improvement or corrective actions as needed. Provides input on, and may conduct, staff annual performance evaluations. Responsible for keeping abreast of current policies, practices and procedures and providing guidance to staff. Assists in urgent situations requiring immediate decision making. Responsible for assisting with/preparing staff schedules and managing coverage arrangements to ensure excellent patient care.

    Required Education/Experience/Specialized Skills: Requires 4 years of coding or charge capture experience in ambulatory or professional fee billing, with 2 years' experience in the use of spreadsheets, graphics, PowerPoint, analytics and database applications with a focus on the performance of charge quality assessments. 2 or more years of supervisory or leadership experience strongly preferred. Requires strong working knowledge of commercial and governmental payor policies. Working experience in the use of medical terminology and Medicare regulatory requirements for coding, billing and reimbursement is required. Familiarity with HIPAA privacy requirements for patient information. Basic understanding of medical ICD9/ICD10 codes, CPT/HCPCS codes and modifiers. Ability to multitask, meet deadlines and stay organized. Must have excellent verbal and written communication skills and customer service skills. Must be detail oriented with the ability to prioritize work. Requires a moderate level of interpersonal and problem-solving skills. Knowledgeable about medical coding and billing specific to insurance and reimbursement processes. Must demonstrate the ability to establish/maintain cooperative working relationships with staff, operations and providers. Proficient in the preparation and presentation of summary information to provide clear and concise coding monitoring and quality updates to focused groups and finance leadership.

    Required Certification/Registration: Requires certification as a Registered Health Information Technician (RHIT), Certified Coding Associate (CCA) or Certified Coding Specialist (CCS) from the American Health Information Management Association (AHIMA), or a comparable certification from another accredited coding organization (e.g. Certified Professional Coder from the American Academy of Professional Coders).

     

     Scripps Health is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, religion, color, national origin, sex, age, status as a protected veteran, among other things, or status as a qualified individual with disability.



    Job: Access / Business Services

    Primary Location : Central San Diego County-LA JOLLA-SCRIPPS MED FOUNDATION ADMINISTRATION

    Organization : 10010 CAMPUS POINT DRIVE



              PHYSICIAN RESOURCES COORDINATOR - SCMC Administration (Campus Point) - Full Time - Day   
    Posted on: 2017-07-01

    This position provides support and assists with all activities related to Physician Resources, physician and physician extender planning, recruiting, credentialing, orientation, special projects and professional standards review. 
    Required Experience/Education/Specialized Skills: Two years' experience in the areas of physician recruitment/staffing, credentialing, human resources/benefits, medical staff office or medical group administrative office. Strong organizational focus. Knowledge of clinic operations. Requires flexibility in dealing with multiple assignments and working in a fast-paced environment, handling multiple tasks and priorities simultaneously. Ability to interact professionally and effectively with, and gain the confidence of, physicians and other practitioners. Excellent interpersonal, communication, decision-making, and organizational skills. Strong writing skills needed to help develop letters, reports, etc. Ability to work both independently and as part of a team. Solid intermediate or higher computer skill level, including the creation of spreadsheets and database information input and maintenance. Established proficiency with multiple software programs, which may include but are not limited to Microsoft Office products (Word, Excel, and PowerPoint).
     
    Preferred Experience/Education/Specialized Skills: Bachelor's degree preferred (appropriate experience can be substituted for education).

    Scripps Health is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, religion, color, national origin, sex, age, status as a protected veteran, among other things, or status as a qualified individual with disability.

    Job: Access / Business Services

    Primary Location : Central San Diego County-SAN DIEGO-SCRIPPS COASTAL MEDICAL CENTER-HILLCREST

    Organization : 501 WASHINGTON ST STE 600



              Comment on Contact by HK   
    Just wanted to say that this site is fantastic! I love the player GIF database you guys have going; makes it so much easier for me to casually analyze a player since I don't have Rewind or anything. Keep up the great job...really looking forward to what you guys have for the draft. This would be opening up a can of worms, but I don't suppose you'd take player GIF requests would you? :P
              What Software Architects Can Learn From Baseball Teams   

    Originally posted on: http://brustblog.net/archive/2013/05/29/what-software-architects-can-learn-from-baseball-teams.aspx


    My friend Larry Clarkin did a whole series on Architecture by Baseball, but after going to a recent White Sox game I got to thinking about how baseball mirrors my experience. For me it boils down to specialization, teamwork and leadership.

    Every team member has their specialty. Infielders have great reactions and throwing accuracy. Outfielders can cover distance quickly and throw long distances. There are starting pitchers who have great control and endurance, and closers who throw nasty pitches for a short time. Likewise, there are specialized positions on a development team. There are UI developers who improve the user experience. You have performance experts who can find every potential roadblock in a piece of code. Then there are security specialists, database gurus and product experts. Each specialist has something to add to the quality of the final product. As an architect you need to give each of these specialists room to do what they do best.

    This then leads to teamwork. You can have a team of great specialists, but if they don’t come together as a team it can mean failure. If baseball players don’t communicate, you end up with errors that can gift wrap runs for the opposing team. The problem for a software team is that the opposing teams are called missed requirements, missed deadlines and poor-quality software. As architects we can help identify which of our specialists are best suited to attack a particular problem, and also help the entire team understand how their tasks fit into accomplishing the project. This helps to bring the team’s focus together instead of everyone working as individuals.

    Who brings all this together? The manager is the leader on the field. I look at this position as being where the architect works. Yes, there is a project manager, but I think of them as the general manager who clears the way so the manager can get the goals of the team accomplished. The architect should be the person who brings the team together and gives it direction. I also believe it is part of my job as an architect to help the developers on my projects improve their code quality and the way they interact with stakeholders. Leading also means being willing to build proofs of concept, or even taking on coding components, so that your team members understand that you really practice what you preach.

    Ultimately you need to know as much as you can about every position, leverage your specialists like pitching and hitting coaches to fill in missing information, bring your players together to win, and be a leader on the field. Good luck.


              IBM AIX Workstation 7011-250 (Bull DPX-20 / 100)   

    Linux is everywhere these days, and everyone brags that they know how to administer a Linux server (even if they are fakers).
    So it no longer impresses on a résumé - nor on the salary.
    Administrators of commercial UNIX systems, however, are in demand. For serious money.

    But you are in luck - whoever wants to learn to administer a commercial UNIX - AIX - can take Costel off my hands.

    Costel is an IBM 7011-250 (rebadged as a Bull DPX-20 / 100, a "European" product so that it could be bought with EU funds)
    with:
    - 1 PowerPC 601 CPU at 66 MHz
    - 96 MB RAM (4*16 + 4*8)
    - 2.88 MB floppy drive
    - 9 GB SCSI HDD
    - 2 10 Mbps network cards with RJ45 transceivers (one on-board, one in an MCA slot)
    - 8-port multi-serial MCA card, for which I do not have the cable.
    - No video card - the console is on a serial port; I will include a serial cable.
    - AIX 5.1L installed, the last version that still supports the PowerPC 601.
    - Original front-panel key.

    Extra:

    - CDs: AIX 4.3.3 install, AIX 5.1 install, a copy of the AIX 4.3.3 freeware archive from bulfreeware.com
    - UltraPlex 40max external SCSI CD-ROM (one of the few models with a selectable sector size - old Macs and SGIs can boot from it)
    - VXA-1 SCSI tape drive in an external Bull SCSI enclosure
    - 15,000 rpm 36 GB SCSI HDD in an external Sun SCSI enclosure
    - All the necessary SCSI cables
    - One 8mm VXA V17 data tape + one 8mm VXA cleaning tape + one 8mm Sony data tape modified for VXA compatibility + 12 more 8mm tapes ready for you to modify.

    Optional:

    - Various serial port adapters
    - 10/100 switch
    - "Skateboard" - a wooden board on casters with an extension cord, which I used so I could easily slide it in and out of a closet (with all the SCSI enclosures running, it is not very quiet)

    What it lacks:

    - A loving owner. At my place it sat in a closet collecting dust.
    - A USB-to-serial adapter for the console - if you want to access the console from a Mac

    How much does it cost?

    100 lei.

    Where do I pick it up?

    Bucharest, Sector 2, near the national stadium.
    I will NOT ship it by courier - it is made of thick steel, and it is HEAVY.

    More information:

    (being a pre-CHRP machine, prtconf cannot read the model, clock frequency or firmware version)
    (the external HDD is unused, which is why it does not show up in df)

    costel:~# prtconf
    lsattr: 0514-528 The "frequency" attribute does not exist in the predefined
            device configuration database.
    lsattr: 0514-528 The "modelname" attribute does not exist in the predefined
            device configuration database.
    lsattr: 0514-528 The "fwversion" attribute does not exist in the predefined
            device configuration database.
    System Model:
    Machine Serial Number: Not Available
    Processor Type: PowerPC_601
    Number Of Processors: 1
    Processor Clock Speed:  MHz
    CPU Type: 32-bit
    Kernel Type: 32-bit
    LPAR Info: -1 NULL
    Memory Size: 128 MB
    Good Memory Size:  MB
    Firmware Version:
    Console Login: enable
    Auto Restart: false
    Full Core: false

    Network Information
            Host Name: costel
            IP Address: 192.168.x.x
            Sub Netmask: 255.255.255.0
            Gateway: 192.168.x.x
            Name Server: 192.168.x.x
            Domain Name: xx.xx

    Paging Space Information
            Total Paging Space: 192MB
            Percent Used: 7%

    Volume Groups Information
    ==============================================================================
    rootvg:
    PV_NAME           PV STATE          TOTAL PPs   FREE PPs    FREE DISTRIBUTION
    hdisk0            active            542         143         34..00..00..00..109
    ==============================================================================

    INSTALLED RESOURCE LIST

    The following resources are installed on the machine.
    +/- = Added or deleted from Resource List.
    *   = Diagnostic support not available.

    Model Architecture: rs6k
    Model Implementation: Uni-Processor, MCA bus

    sys0              00-00             System Object
    sysplanar0        00-00             System Planar
    ioplanar0         00-00             I/O Planar
    bus0              00-00             Microchannel Bus
    sio0              00-00             Standard I/O Planar
    fda0              00-00-0D          Standard I/O Diskette Adapter
    fd0               00-00-0D-00       Diskette Drive
    ent0              00-00-0E          Integrated Ethernet Adapter
    sioka0            00-00-0K          Keyboard Adapter
    sioma0            00-00-0M          Mouse Adapter
    ppa0              00-00-0P          Standard I/O Parallel Port Adapter
    scsi0             00-00-0S          Standard SCSI I/O Controller
    hdisk0            00-00-0S-0,0      Other SCSI Disk Drive
    cd0               00-00-0S-1,0      Other SCSI CD-ROM Drive
    hdisk1            00-00-0S-4,0      Other SCSI Disk Drive
    rmt0              00-00-0S-6,0      Other SCSI Tape Drive
    siota0            00-00-0T          Tablet Adapter
    sa0               00-00-S1          Standard I/O Serial Port 1
    tty0              00-00-S1-00       Asynchronous Terminal
    sa1               00-00-S2          Standard I/O Serial Port 2
    tty1              00-00-S2-00       Asynchronous Terminal
    ent1              00-01             Ethernet High-Performance LAN Adapter (8ef5)
    sa2               00-02             8-Port Asynchronous Adapter EIA-232
    proc0             00-00             Processor
    mem0              00-0A             16 MB Memory SIMM
    mem1              00-0B             16 MB Memory SIMM
    mem2              00-0C             16 MB Memory SIMM
    mem3              00-0D             16 MB Memory SIMM
    mem4              00-0E             8 MB Memory SIMM
    mem5              00-0F             8 MB Memory SIMM
    mem6              00-0G             8 MB Memory SIMM
    mem7              00-0H             8 MB Memory SIMM
    sysunit0          00-00             System Unit

    costel:~# oslevel -r
    5100-09

    costel:~# df -Pk
    Filesystem    1024-blocks      Used Available Capacity Mounted on
    /dev/hd4            32768     13568     19200      42% /
    /dev/hd2          2048000   1445004    602996      71% /usr
    /dev/hd9var         32768     22152     10616      68% /var
    /dev/hd3            32768      1988     30780       7% /tmp
    /dev/hd1            32768      1140     31628       4% /home
    /dev/lv00         4096000   3187444    908556      78% /data
    /proc                   -         -         -       -  /proc
    /dev/hd10opt        32768      9552     23216      30% /opt

    I await offers or questions.


              IN THE ROOM with Bobby Fish (2/29/2012)   
    This week, massive database problems plague TBH.com, forcing Brady Hicks to stream a LIVE one hour version of IN THE ROOM exclusively on WEXP 1600 AM in Philadelphia. Fortunately, you are reading this which means the database issue has been resolved and life on TBH.com can carry on as usual. Check it out as independent standout – and international star – Bobby Fish stops by to talk about his entry in the 2012 ECWA Super 8 tournament, plus Pro Wrestling Illustrated’s Brady Hicks – with Derrick McDonald – break down all that they know about this year’s WrestleMania so far. Be sure to check out ECWA Super 8, Saturday, April 7 in Newark, Delaware, and thanks, as always, for the support!
              FAQ   
    Akismet checks your comments and contact form submissions against our global database of spam to protect you and your site from malicious content.
              This Popular Island Destination Is Exporting Monkeys for Cruel Experiments   
    Mauritius is one of the world's largest suppliers of non-human primates for inhumane medical experiments.

    Mauritius, a small island in the Indian Ocean, is a dream holiday destination for tourists from all over the world. It is famous for its beautiful beaches, lagoons, tropical climate, heritage sites, lush forests and wildlife. Yet, this idyllic location is also infamous for a sinister reason—the cruel exploitation of its population of monkeys. Mauritius is one of the world’s largest participants in the cruel trade of supplying non-human primates for experiments. In 2016, 8,245 long-tailed macaques were exported from Mauritius to the USA, Canada and Europe with 3,522 imported by the USA, the largest importer of monkeys from Mauritius.

    In Mauritius, the long-tailed macaque (Macaca fascicularis) lives freely. However, the species is not considered indigenous, despite having been well-established on the island for about 400 years. Although the species is listed on Appendix II of the Convention on the International Trade in Endangered Species of Wild Fauna and Flora (CITES), there exists no legislation to protect the primates of Mauritius. Instead, they are widely persecuted and exploited.

    Historically, monkeys were trapped in the wild to be shipped overseas. Following international condemnation of the trade in wild-caught primates, tens of thousands of primates are now held in farms across Mauritius. Many of these animals were captured from the wild and are now imprisoned in these farms and used for breeding. Denied their freedom in the lush foliage of their jungle homes, these individuals spend their lives behind bars, on concrete. Their offspring are transported as ‘cargo’ in small wooden crates on airplanes to laboratories around the world to feed the international research industry.

    Tourism is a key pillar of the economy of Mauritius and contributes significantly to the economic growth of the island. Mauritius is also promoting the island’s image as a green, eco-friendly tourist destination. The reputation of Mauritius as a country where the environment is valued is being put at risk by the export of monkeys for cruel experiments. Added to this is the introduction of recent regulations that will, for the first time, allow such experiments to be carried out on the island itself. The main species to be used in the research will be the country’s population of long-tailed macaques.

    It cannot be argued that the economic benefits of the monkey trade and potential revenue from experiments are more important than tourism. Even a brief glance at the figures shows this controversial trade, worth less than 2 percent of Mauritian export, is economically insignificant compared with the income that Mauritius receives from its tourism industry. It is well-established that if a country develops a reputation for unkind treatment of animals, it has a very strong negative effect on tourism.

    An additional, and equally puzzling, factor to consider is that Hinduism is the largest religion in Mauritius. The country has the third highest percentage of Hindus in the world after Nepal and India. Lord Hanuman, the monkey god, is one of the most popular idols in the Hindu religion and is worshipped as a symbol of physical strength, perseverance and devotion. The trade in primates on Mauritius is clearly contrary to the very concept of Hindu culture and society, which emphasises the spiritual equality of all living beings.

    There are concerns that the introduction of animal experiments to Mauritius is primarily to provide a new market for the primate breeding companies and a reaction to problems with airlines refusing to transport primates for research purposes, moves to impose tighter restrictions on the import of primates within the European Union and a growing public concern about the use of primates in research. Animal researchers and companies may be looking to travel to Mauritius to carry out research that would not be allowed to take place in their own country.

    A glance at the new regulations governing the experiments shows that substantial sections have simply been taken from EU and UK legislation, but this has not been consistently done, so there are significant gaps and contradictions. For example, there is no provision for governmental inspections of laboratories. Nor are there any rules in the regulations about the housing, environment and enrichment to be provided to animals. Furthermore, transparency and accountability appear to be absent because, although there is a requirement for researchers to submit records to the government, there is no provision for the government to subsequently put such information into the public domain. 

    The long-tailed macaque is the most widely traded primate species for research worldwide and the most widely-traded mammal on the CITES database. In the laboratory, these primates may suffer substantially, including the effects of poisoning (such as vomiting, internal bleeding, weight loss, organ failure and even death) after being forced to consume large quantities of chemicals or drugs in toxicity tests or face being subjected to major brain surgery, their skulls cut open and devices implanted into their brains.

    Examples of recent research carried out on long-tailed macaques in the USA make disturbing reading: 1) experiments that have attempted to mimic traumatic military injuries; 2) forced addiction to recreational drugs such as alcohol and cocaine; 3) injections of phencyclidine (PCP or ‘angel dust’); and 4) forced inhalation of cigarette smoke for several hours a day (for some monkeys the equivalent of a person smoking four packs of cigarettes a day).

    The development of alternative methods to using animals is a growing and pioneering field. There is now a wide range of more human-relevant and humane approaches, and animal tests are being replaced in areas such as toxicity testing, neuroscience and drug development. These alternatives include cell, tissue and organ cultures; methods using chemistry, computers or imaging machines; and ethical and highly effective studies using human volunteers.

    Cruelty Free International is dedicated to ending this cruel exploitation of the Mauritius monkeys. We believe that the focus for Mauritius should instead be on these new technologies for non-animal experiments and we are urging Mauritius to become a forward-thinking country that adopts humane and cutting-edge alternatives. Mauritius’ image abroad is already tarnished because of its role in the cruel international trade in monkeys for research. Allowing animal experiments to take place will have a further negative impact and likely result in further widespread protest.

    Our campaign has received widespread support from around the world, including in Mauritius, by scientists, wildlife experts, politicians and socio-cultural groups as well as members of the public. Indian politician Maneka Gandhi and internationally renowned primatologist Dr. Jane Goodall have also voiced their concerns.

    There are three actions you can take to support our campaign to protect the monkeys of Mauritius and let government officials know that what they are doing is unacceptable:

    1. Send an email/letter to the Mauritius Embassy in Washington:

    mauritius.embassy@verizon.net

    washingtonemb@govmu.org

    H. E. Mr S. Phokeer

    Ambassador Extraordinary and Plenipotentiary

    Mauritius Embassy

    1709 N. Street, NW

    Washington D.C. 20036

    2. Send an email/letter to the Minister of Tourism in Mauritius:

    mtou@govmu.org

    The Hon Anil Kumarsingh GAYAN, SC

    Minister of Tourism

    Ministry of Tourism

    Level 5, Air Mauritius Centre

    John Kennedy Street

    Port Louis

    Mauritius

    3. Sign this petition.

     



              Ratings Changes Today   
    TheStreet Quant Ratings provides fair and objective information to help you make educated investing decisions. We rate over 4,200 stocks daily and provide 5-page PDF reports for each stock. These ratings can change daily, and today's changes are reflected in the email below. If you are looking to check up on the stocks you currently own, or are looking for new ideas, you can find our full database of password-protected ratings reports in our proprietary ratings screener: http://www.thestreet.com/k/qr/flat/stock-screener.html Upgrades: ETH Downgrades: HAL, JNPR Initiations: None Read on to get TheStreet Quant Ratings' detailed report:
              Ratings Changes Today   
    TheStreet Quant Ratings provides fair and objective information to help you make educated investing decisions. We rate over 4,200 stocks daily and provide 5-page PDF reports for each stock. These ratings can change daily, and today's changes are reflected in the email below. If you are looking to check up on the stocks you currently own, or are looking for new ideas, you can find our full database of password-protected ratings reports in our proprietary ratings screener: http://www.thestreet.com/k/qr/flat/stock-screener.html Upgrades: HALL, LAKE, PRGS, QNST, USAP Downgrades: AUDC, CSTE Initiations: None Read on to get TheStreet Quant Ratings' detailed report:
              Database Administrator - Level 3 - SPLICE - Calgary, AB   
    The planning, design, and development of new database applications; Are you an experienced Database Administrator who takes pride in keeping projects running...
    From Splice - Fri, 30 Jun 2017 23:54:41 GMT - View all Calgary, AB jobs
              Business Analyst (12 Month Contract) - Bayer Canada - Calgary, AB   
    60% Database design, support, administration and development:. 12 Month Contract....
    From Bayer Canada - Fri, 30 Jun 2017 20:45:03 GMT - View all Calgary, AB jobs
              Administrative Assistant - Tufts Health Plan - Watertown, MA   
    External applicants must demonstrate potential to become proficient in Lotus Notes e-mail, Lotus Notes calendar, Lotus Notes databases and other Tufts Health...
    From Tufts Health Plan - Wed, 05 Apr 2017 21:40:47 GMT - View all Watertown, MA jobs
              Database Developer/Administrator - Massachusetts Eye and Ear Infirmary - Boston, MA   
    Communicates problems that might affect the success or schedule of any assigned project in a timely and appropriate fashion to team members, Manager and the CIO...
    From Massachusetts Eye and Ear Infirmary - Thu, 20 Apr 2017 11:49:32 GMT - View all Boston, MA jobs
              Product Database Specialist - Teknion Limited - Toronto, ON   
    Baan, Operations, PPG etc. Teknion creates furniture that connects people, technology and spaces....
    From Teknion Limited - Fri, 23 Jun 2017 23:43:48 GMT - View all Toronto, ON jobs
              Accounting Assistant (Part-time) - Southern Current LLC - Charleston, SC   
    Assist controller with various accounting and HR duties. 12-20 hours per week. Accounts payable duties to include posting invoices, maintaining vendor database,... $12 - $15 an hour
    From Indeed - Fri, 17 Mar 2017 19:21:27 GMT - View all Charleston, SC jobs
              Specialist, Public Relations - Indigo Books & Music - Toronto, ON   
    Ongoing development and maintenance of Indigo public relations media and blogger database. Build and maintain strong, positive relationships with media teams...
    From Indigo Books & Music - Thu, 22 Jun 2017 03:09:44 GMT - View all Toronto, ON jobs
              RankChart - Compare Site Performance, Alexa History Charts and Statistics   
    RankChart History Database. Compare your site rank position to competitors. Track your website performance even if it's under 300k. Find top sites.
              Physician Order Entry Or Nurse Order Entry? Comparison of Two Implementation Strategies for a Computerized Order Entry System Aimed at Reducing Dosing Medication Errors   
    Background: Despite the significant effect of computerized physician order entry (CPOE) in reducing nonintercepted medication errors among neonatal inpatients, only a minority of hospitals have successfully implemented such systems. Physicians' resistance and users' frustration seem to be two of the most important barriers. One solution might be to involve nurses in the order entry process to reduce physicians’ data entry workload and resistance. However, the effect of this collaborative order entry method in reducing medication errors should be compared with a strictly physician order entry method. Objective: To investigate whether a collaborative order entry method consisting of nurse order entry (NOE) followed by physician verification and countersignature is as effective as a strictly physician order entry (POE) method in reducing nonintercepted dose and frequency medication errors in the neonatal ward of an Iranian teaching hospital. Methods: A four-month prospective study was designed with two equal periods. During the first period POE was used, and during the second period NOE was used. In both methods, a warning appeared when the dose or frequency of the prescribed medication was incorrect and suggested the appropriate dosage to the physicians. Physicians’ responses to the warnings were recorded in a database and subsequently analyzed. Relevant paper-based and electronic medical records were reviewed to increase credibility. Results: Medication prescribing for 158 neonates was studied. The rate of nonintercepted medication errors during the NOE period was 40% lower than during the POE period (rate ratio 0.60; 95% confidence interval [CI] 0.50-0.71; P < .001). During the POE period, 80% of nonintercepted errors occurred at the prescription stage, while during the NOE period, 60% of nonintercepted errors occurred in that stage. Prescription errors decreased from 10.3% during the POE period to 4.6% during the NOE period (P < .001), and the proportion of warnings with which physicians complied increased from 44% to 68%, respectively (P < .001). Meanwhile, transcription errors showed a nonsignificant increase from the POE period to the NOE period. The median error per patient was reduced from 2 during the POE period to 0 during the NOE period (P = .005). Underdose and curtailed and prolonged interval errors were significantly reduced from the POE period to the NOE period. The rate of nonintercepted overdose errors remained constant between the two periods. However, the severity of overdose errors was lower in the NOE period (P = .02). Conclusions: NOE can increase physicians' compliance with warnings and recommended dose and frequency, and can reduce nonintercepted medication dosing errors in the neonatal ward as effectively as POE, or even more so. In settings where there is major physician resistance to the implementation of CPOE, and nurses are willing to participate in the order entry and are capable of doing so, NOE may be considered a beneficial alternative order entry method.

              Build a Website (Copy another website) by DatabaseBusiness   
    ** Bid your best price - explain to me what you will do and not do. Lowest price with most done will win! Provide timeline as well. This is a simple website project - please copy this website (with a few modifications) - it's about 10 pages... (Budget: $50 - $100 USD, Jobs: Graphic Design, HTML, PHP, Website Design)
              Accounting Assistant (Part-time) - Southern Current LLC - Charleston, SC   
    Assist controller with various accounting and HR duties. Accounts payable duties to include posting invoices, maintaining vendor database, ensuring proper $12 - $15 an hour
    From Indeed - Fri, 17 Mar 2017 19:21:27 GMT - View all Charleston, SC jobs
              Online CD Rental Software and cd database software   
    Online CD Rental Software provides an open attribute architecture; it is a unique rental script that can be easily customized for any vertical business industry and any commodity.
              Mobiliser - Sunil S.kulkarni - Hubli, Karnataka   
    Candidate has to have experience in mobilising candidates for varied courses by going into villages collecting database, doing seminars, counselling and ₹15,000 a month
    From Babajob - Sat, 24 Jun 2017 10:06:24 GMT - View all Hubli, Karnataka jobs
              Reply #876   
    Hey, thanks for the warning. I'm still trying to get my North Carolina databases right so I can do the same workups I do in Virginia. There's one more thing I'm going to work on tonight, then I should be good to go.

    For tonight I have:

    011 052 149 205 208 240 273 325 326 333 437 461 503 517 551 575 612 651 791 830 837 854 858 882 900 934 943 982

    I may tweak that a little more, but it's a good start.
              More certain   

    Searching our database for: More certain crossword clue answers and solutions. This crossword clue was seen today at Independent Concise Crossword July 2 2017. Found 1 possible solution matching the query More certain that you searched for. Kindly check the possible answer below and if it’s not what you are looking for then use the […]

    The post More certain appeared first on DailyCrosswordSolver.co.uk.


              Indian currency unit   

    Searching our database for: Indian currency unit crossword clue answers and solutions. This crossword clue was seen today at Independent Concise Crossword July 2 2017. Found 1 possible solution matching the query Indian currency unit that you searched for. Kindly check the possible answer below and if it’s not what you are looking for then […]

    The post Indian currency unit appeared first on DailyCrosswordSolver.co.uk.


              Midday   

    Searching our database for: Midday crossword clue answers and solutions. This crossword clue was seen today at Independent Concise Crossword July 2 2017. Found 1 possible solution matching the query Midday that you searched for. Kindly check the possible answer below and if it’s not what you are looking for then use the search form […]

    The post Midday appeared first on DailyCrosswordSolver.co.uk.


              Tableland   

    Searching our database for: Tableland crossword clue answers and solutions. This crossword clue was seen today at Independent Concise Crossword July 2 2017. Found 1 possible solution matching the query Tableland that you searched for. Kindly check the possible answer below and if it’s not what you are looking for then use the search form […]

    The post Tableland appeared first on DailyCrosswordSolver.co.uk.


              Rice dish   

    Searching our database for: Rice dish crossword clue answers and solutions. This crossword clue was seen today at Independent Concise Crossword July 2 2017. Found 1 possible solution matching the query Rice dish that you searched for. Kindly check the possible answer below and if it’s not what you are looking for then use the […]

    The post Rice dish appeared first on DailyCrosswordSolver.co.uk.


              Colourless gas   

    Searching our database for: Colourless gas crossword clue answers and solutions. This crossword clue was seen today at Independent Concise Crossword July 2 2017. Found 1 possible solution matching the query Colourless gas that you searched for. Kindly check the possible answer below and if it’s not what you are looking for then use the […]

    The post Colourless gas appeared first on DailyCrosswordSolver.co.uk.


              Catch fire   

    Searching our database for: Catch fire crossword clue answers and solutions. This crossword clue was seen today at Independent Concise Crossword July 2 2017. Found 1 possible solution matching the query Catch fire that you searched for. Kindly check the possible answer below and if it’s not what you are looking for then use the […]

    The post Catch fire appeared first on DailyCrosswordSolver.co.uk.


              Sicilian volcano   

    Searching our database for: Sicilian volcano crossword clue answers and solutions. This crossword clue was seen today at Independent Concise Crossword July 2 2017. Found 1 possible solution matching the query Sicilian volcano that you searched for. Kindly check the possible answer below and if it’s not what you are looking for then use the […]

    The post Sicilian volcano appeared first on DailyCrosswordSolver.co.uk.


              Recount   

    Searching our database for: Recount crossword clue answers and solutions. This crossword clue was seen today at Independent Concise Crossword July 2 2017. Found 1 possible solution matching the query Recount that you searched for. Kindly check the possible answer below and if it’s not what you are looking for then use the search form […]

    The post Recount appeared first on DailyCrosswordSolver.co.uk.


              Hundreds of Atheists Try to Bring Down Atheist Movie IMDb Rating   
    Contact: Jen Thompson, 800-437-1893, jthompson@livingwaters.com. LOS ANGELES, Oct. 25, 2016 /Christian Newswire/ -- Hundreds of atheists have left one-star ratings at the Internet Movie Database in an effort to discredit a new movie called "The Atheist Delusion." This is according to the film's producer, Ray Comfort. Comfort said, "When I saw all the bad reviews, I encouraged Christians to watch the movie, then go to www.imdb.com/title/tt5910814/ratings?re Source: Living Waters
              Holdem Wizard 2.0.7.2   
    Holdem Wizard automatically calculates the player's chances (odds and outs) as well as the chances of the opponents. This information is updated during all stages of the game. The results take into account the number of opponents and are displayed both numerically and graphically. For its calculations, Holdem Wizard uses a database that each player builds up from his own games.
              The database of the PREDICTS (Projecting Responses of Ecological Diversity In Changing Terrestrial Systems) project.   
              Change User login permissions on a sql 2005 database that gets overwritten daily   

    LiveDBUser was intended to be replaced by the usernames you wanted to deny access to.  Sorry I didn't state that.


              Change User login permissions on a sql 2005 database that gets overwritten daily   

    Ok, so I managed to get it working with the following script:

    use [Database_OLD]

    GO

    GRANT CONNECT TO [user1]

    GO

     

    This isn't ideal since I have to manually edit the script if I add users to SQL. LiveDBUser doesn't work. What statement do I use to call all users?


              Change User login permissions on a sql 2005 database that gets overwritten daily   

    You would open the maintenance task and add a step. On my 2008 server I'd scroll down the left side of the SSMS utility to the "Management" area, unfold it to "Maintenance Plans", then right-click and modify the appropriate maintenance plan.

    Just add an appropriate Maintenance plan task to "Execute T-SQL Statement".  Add the appropriate SQL and hope you've not just locked all users out of all databases. 

    That little joke is just my way of saying you should be careful. 


              Change User login permissions on a sql 2005 database that gets overwritten daily   

    SQL Dummy here, can I bother you guys for a step by step on how to add the script and what exactly the script would be?


              Change User login permissions on a sql 2005 database that gets overwritten daily   

    Hi,

    I would add another step to the SQL Server Agent Job that runs the backup and restore which runs the deny script mentioned by David.


              Change User login permissions on a sql 2005 database that gets overwritten daily   

    I would probably just add lines to the script that makes the backup, which would change the permissions on the database. 

    DENY CONNECT TO LiveDBUser

    or something to that effect.


              Change User login permissions on a sql 2005 database that gets overwritten daily   

    I have a SQL backup-and-restore maintenance plan set up that backs up and restores the production database nightly on the same server. We use this as a test database. I would like to prevent users from accidentally logging into this database, even though it is clearly named as a test database. There have been times when we use this test database at a user workstation, only to find that after we log off, the test database remains on the login screen because SQL remembers the last database you logged into. When the user returns, they inadvertently log into the test database and resume work. I can manually change the permissions on this test database to deny logins, but these permissions get overwritten daily. Any ideas?

    This topic first appeared in the Spiceworks Community
              Systems Analyst (Teradata Database Developer) - McCain Foods (Canada) - Florenceville, NB   
    McCain Foods is seeking a Systems Analyst, specialized in Teradata database development, to contribute to the success of our Enterprise Data Warehouse (EDW)...
    From McCain - Tue, 04 Apr 2017 12:46:27 GMT - View all Florenceville, NB jobs
              Marketing Manager - DAP Products Inc. - Ontario   
    This includes the development and management of a database warehouse of consumer-focused and retail sales fact-based data....
    From DAP Products Inc. - Wed, 07 Jun 2017 00:02:06 GMT - View all Ontario jobs
              Senior Database Developer   

              SQL Database Administrator   

              Purchasing & Estimating Agent - Crescent Homes SC LLC - Charleston, SC   
    Must be able to use word processing, e-mail, spreadsheets, database software, creation of reports and database maintenance....
    From Crescent Homes SC LLC - Sun, 07 May 2017 10:24:30 GMT - View all Charleston, SC jobs
              Accounting Assistant (Part-time) - Southern Current LLC - Charleston, SC   
    Accounts payable duties to include posting invoices, maintaining vendor database, ensuring proper authorization of invoices and processing check requests.... $12 - $15 an hour
    From Indeed - Fri, 17 Mar 2017 19:21:27 GMT - View all Charleston, SC jobs
              Part-Time Dispatcher/Customer Service Rep - Cash for Trash - Stittsville, ON   
    Proficient computer skills (email, google, google drive, word, excel online databases etc). Handling and balancing cash....
    From Indeed - Wed, 24 May 2017 17:57:41 GMT - View all Stittsville, ON jobs
              Dispatcher/Customer Service Rep - Cash for Trash - Stittsville, ON   
    Proficient computer skills (email, google, google drive, word, excel online databases etc). Handling and balancing cash....
    From Indeed - Wed, 24 May 2017 17:49:26 GMT - View all Stittsville, ON jobs
              Billboard-based Internet Marketing Campaign and Tracking & Measuring ROI   

    Placing an advertisement on a billboard in a high traffic area will surely bring results. Measuring it, however, can be a tough job. For each marketing campaign, measuring performance is crucial. Measuring the efficiency of an advertisement depends on 2 things:

    1. Clickthrough Rate (CTR) and 
    2. Return on Investment (ROI). 

    CTR

    For a billboard, Clickthrough Rate refers to the percentage of impressions that result in the desired action. Unless eye and camera tracking technology improves, it is not possible to monitor the clickthrough rate for a billboard.

    ROI

    However, monitoring the return on marketing investment, or ROI, from a billboard is very much possible - if you know your traffic sources. But to track and measure a billboard's ROI, you have to turn it into an internet marketing campaign with a call to action built on internet technology.

    Hire us for Interactive Marketing - How to Turn Your Billboard into an Interactive Marketing Campaign

    If you invest in Billboard Advertising, then you can take some initiatives to make your display more interactive while effectively measuring its ROI, viz.

    Using a Landing Page URL

    A website address on your billboard is good for marketing and branding but does not allow you to properly measure ROI from multiple campaigns that point to your domain. The best way to measure ROI is to refer to an alternate URL, one that does not disturb your expected web traffic. For the URL, you can:

    • Create a new isolated/orphan landing page on your site (make sure it is blocked from search engines and not used in other web marketing campaigns)
    • Purchase a new domain and refer the billboard visitors to that site (you can also make it redirect to the landing page on your site or just use URL masking or use frames/iframes)
    • Use the existing landing page with parameters built by Google URL Builder (it allows you to see custom reports for your ad variations in web analytics; an example follows below)
    • Use an easy to read bit.ly shortened link. bit.ly specifically delivers analytics that you can use to measure traffic from your advertisement. Other URL Shortener Services that supply analytics will work just as well.

    The URL visits can be easily tracked with Google Analytics or any other web analytics application. This gives the campaign transparency and allows for high ROI visibility. The URL you show on the billboard should be easily readable; long URLs won't typically generate action. Often, most conversions happen after the fact, so it is wise to help readers remember your URL.
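
    As a sketch of the Google URL Builder approach mentioned above, a billboard landing page tagged with Google Analytics' standard UTM parameters might look like this (the domain and values are purely illustrative):

    http://www.example.com/billboard-offer?utm_source=billboard&utm_medium=outdoor&utm_campaign=downtown

    Visits carrying these parameters then show up in Google Analytics under that source, medium, and campaign, cleanly separated from your regular traffic.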

    Using QR Codes

    QR Codes have been around for a long time, but the use of these 2-dimensional codes for Interactive Marketing is a new trend. They let you download information or trigger a specific action on your mobile phone using widely available bar-code scanners. Semacode for the iPhone, QuickMark for Android phones, ScanLife for BlackBerry, and UpCode for Symbian/Nokia phones are among several competing apps that can read QR Codes from any printed media or digital screen. QR Codes are now being used for Internet Marketing, Mobile Marketing, and even Word of Mouth Marketing. With QR Codes, any camera-enabled smartphone user can:

    • Be referred to a specific URL
    • Get a phone number, allowing them to call instantly
    • Download business cards to save on their phone and make calls at a later time
    • Get an email address and/or enable them to send emails from phone instantly
    • Be referred to a geographical location on Google Maps and help them to come to your place of business
    • Be sent an event invitation, which can be saved in the phone's calendar
    • Be shown a text message (This can also be a coupon)
    • Get an SMS template to send queries to your business cellphone number
    • Gain access to your sponsored WiFi Network, etc.

    Interaction with QR Codes is increasing day by day, and they are steadily becoming the bridge between the real and virtual worlds. Determine the type of interaction that will deliver the best visibility of your ROI. There are many sites that allow you to generate QR codes (e.g. the Zxing Project's QR Generator). A major service currently in use is the Google Chart API. Here is an article on how you can generate QR Codes in bulk with Google Chart API.
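
    As a rough sketch of that bulk approach, the Google Chart API returns a QR code image from a single GET request; the payload here is illustrative:

    https://chart.googleapis.com/chart?cht=qr&chs=300x300&chl=http%3A%2F%2Fwww.example.com%2Fbillboard-offer

    cht=qr selects the QR chart type, chs sets the image size in pixels, and chl carries the URL-encoded content, so bulk generation is just a matter of building one such URL per landing page.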

    Bluetooth Advertising Near the Billboard

    Bluetooth adverts are the newest trend in mobile marketing. If your billboard is in a convenient location, you can send more interactive information and coupons to Bluetooth users near your billboard. You can easily transmit richer content to your viewers. Moreover, you can also digitally track the Bluetooth downloads.

    Email Tracking

    Email is one of the most convenient ways of communication, even on mobile devices. Make sure you put a different email address on the billboard than the one you use in other media campaigns. This allows you to verify that the emails arriving at the allocated address are coming from the billboard advert. If you incentivize people to send emails to that address, e.g. with a coupon, you can also nudge the billboard audience to use the custom address rather than other addresses or web contact forms.

    Using Call Analytics

    HostedNumbers.com, AdInsight.eu, and Mongoose Metrics have call tracking services that can be monitored from Google Analytics. However, there are several other call tracking and analytics packages that you can use. They generate a unique phone number (even toll-free) that you can put on the billboard. When someone calls, you can see how much traffic came from your billboard advertisement and then use Google Analytics to see how long they were on the call. Again, it is imperative that this phone number is not made available elsewhere. There will always be a risk that word of mouth marketing will ruin your direct analytics; that information being shared is beyond your control.

    Using Coupons

    Coupons are one of the best ways to track where a customer has come from. You can publish an easily readable coupon code for everyone (e.g. gimme20%less) or generate a custom coupon code from any specific URL mentioned on the billboard. The URL can be a purchased domain, a landing page on your site, or an easily readable shortened link. This enables you to distinguish between regular web traffic and traffic from your billboard. Numerous software packages and online services will allow you to easily generate custom coupons through the URL. If you are sending the billboard reader to a URL with a custom coupon, it is recommended to have them visit a custom landing page that is blocked from search engines. Once people use their coupons, you are able to retrieve information on traffic sources from the database. Make sure that the coupon codes can be tracked, and to get a true ROI on that particular segment of your campaign, ensure that the code is not made available in other forms of media (e.g. search engine, word of mouth, etc).

    ©  of SEOPPCSMM.COM


              Who’s Afraid of Front-End Databases? Slide Deck and Demos   
    Last evening I had the pleasure of presenting a lightning talk in AngularJS Labs Meetup in London. First of all, I want to thank the great audience for coming to hear the sessions in the Meetup. I also want to thank Gerard Sans for the invitation to speak and for organizing a great evening.My session […]
              DOES YOUR COMPANY REQUIRE APPLICANT TRACKING ASSISTANCE? - A RIGHT RESUME   
    A Right Resume Applicant Tracking Division features the RESUMate Database for safe and secure storage of all resumes that are administered by our expert staff. In addition, we provide continuous management and retrieval of your business applicants...(click link to see full post)
              VIDEO: Grand Forks startup creates software for aerial wind turbine inspection   
    By Grand Forks Herald
    A pockmark the size of a penny on a 154-foot wind turbine blade doesn't look like much, but with time it can turn into a menace.

    If left unattended, the damage could grow and result in expensive repairs, the likes of which Grand Forks startup company EdgeData is trying to prevent through inspections completed by unmanned aircraft.             
    EdgeData employees aren't behind the controls of the aircraft but rather the digital platform analyzing photos collected by a camera attached to the device.

    "We take images of wind turbines in the field and turn that into consumable data," EdgeData President Chris Shroyer said Thursday.

    Cameras mounted on unmanned aircraft are capable of capturing hundreds or thousands of pictures in a single flight, but those images may be unusable unless processed in a way that shows the customer exactly what they want to see, including the location of potential problems on a blade.

    "Instead of looking at 900 pictures, we're only looking at five," Director of Operations Greg Thorsteinson said.

    The software, called BladeEdge, allows the company to remove organic material, such as dirt and bird feces, from images so customers can get a better picture of damage to a blade. The analysis produced by the company's software also tracks the conditions of blades and automatically sets up maintenance cycles.

    Inspection progress
    Maintaining a database of information about blade condition is critical for turbine owners and manufacturers.

    EdgeData's software allows owners to track repair needs and provides manufacturers with information about their product's performance.

    In Grand Forks, one of those manufacturers is LM Wind Power, which produces three types of wind blades and is often the first point of contact when customers notice something is wrong with their turbine.

    Catching problems early is key to prolonging the life of the blades, according to John Jeno, senior engineering manager at LM Wind Power Blades Inc.

    "Blades are a wear item—they need to be maintained. And if you catch (damages) when they're small, it's a very simple, fast fix," Jeno said. "If you don't take care of it and a year goes by, then you're talking about a couple of guys and climbing platforms. ... That becomes an expensive repair."

    Under normal conditions, the blades spin at about 100 mph. Buildup from bugs, dirt, dust and bird feces all affects the blades' performance, as do collisions with objects such as hailstones and grit, which erode the protective coating on the blade.

    "If you lose some of these coats, especially on the leading edge (of the blade), it can degrade very quickly and rapidly," Shroyer said.

    Finding erosion, cracks and other damage falls to an inspection crew. Without an unmanned aircraft, a three-man crew climbing the structure would complete an inspection on one turbine in the course of a day.

    "You're hanging a guy off a rope with a three-man team, and that's expensive. It's time-consuming and it puts somebody at risk," Jeno said. "Or you're bringing in a crane to get up 345 feet."

    The high cost of inspections means some turbine owners defer maintenance until a problem grows too large to ignore. The use of unmanned aircraft will likely bring a level of affordability to inspections that will allow owners to perform them more often on their turbines, Jeno said.

    The aircraft coupled with the software is expected to make quick work of future inspections, with Shroyer predicting as many as 10 a day once the technology is advanced enough. For now, he estimates an unmanned aircraft system team could complete five inspections a day.

    In-house flights
    EdgeData has had opportunities to put its software to the test in the region and on the East Coast, as its crews have flown around turbines in North Dakota and Massachusetts.

    The company also has sections of turbine blades mounted in its office, located in the UND Tech Accelerator. The presence of the parts allows the company to conduct research when weather conditions outside are not ideal, Shroyer said.

    A 23-foot blade tip—an entire blade is about 154 feet—stands upright in the office and, complete with nicks and other damage to be detected during flights, serves as a practice specimen.

    As EdgeData deals in data analysis, it partners with other firms to complete flights. In Grand Forks, one possible partner is another startup, SkySkopes, which offers aerial photography and video services for inspecting infrastructure.

    "EdgeData is an asset to the (UAS) ecosystem, especially locally," Skyskopes President Matt Dunlevy said, referring to the fact that small UAS-related companies in Grand Forks usually have a specialization and, therefore, often form partnerships with one another to conduct research and advance the technology.

    Founded in December, EdgeData has operated from its current office since May.

    The month prior, it received a $450,000 grant from Research ND, a program facilitated by the state Department of Commerce.

    With a specialty in data, the company has a number of security measures in place to protect the data it captures for customers. Larger data collection means more security rules, as some wind farms are considered critical power infrastructure and need to be protected from hacking.

    "Security is the highest priority," Thorsteinson said.

    VIDEO: Grand Forks startup creates software for aerial wind turbine inspection - Grand Forks Herald

    read more


              [J.F. Sebastian] Craigslist. It seemed fishy... but the holograms were convincing. The box was r...   
    Craigslist. It seemed fishy... but the holograms were convincing. The box was right. It installed fine too. Worked perfectly for some time until Microsoft spied on my machine and identified it as counterfeit.
    Not really sure I'd consider that "spying" since all they have to do is check your serial number in their database, the one that gets uploaded when you install it, but that doesn't explain why it worked in the first place.

    I consider spying to be something more like the way they use Cortana, the built-in voice assistant in Windows 10.

    Cortana: The spy in Windows 10

    In any case, I would never buy supposedly new software from Craigslist just to save a few bucks. It's simply not worth it, as you found out. Craigslist goods and people can be dodgy enough without making it easy for them to rip you off. You're better off buying from eBay where you can check a seller's reputation and make sure they have a long history of providing genuine goods. Craigslist is always a last resort for me as a buyer.

              A birder gives wings to three centuries of South Asian ornithology   
    Aasheesh Pittie has single-handedly indexed a monumental bibliographic database and made it searchable through keywords for naturalists worldwide
               Ordering Disorder : ninemsn, Hypertext and Databases    
    Jarrett, Kylie (2005) Ordering Disorder : ninemsn, Hypertext and Databases. M/C: a journal of media and culture, 7 (6). pp. 1-3. ISSN 1441-2616
               DSEARCH: sensitive database searching using distributed computing    
    Keane, T.M. and Naughton, T.J. (2005) DSEARCH: sensitive database searching using distributed computing. Bioinformatics, 21. pp. 1705-1706.
              Genes Linked to Memory Identified, Opens New Avenues to Brain Research    
    More than 100 genes linked to memory were identified by a research team from UT Southwestern Medical Center. This could open new avenues of research into the brain.

               SleepOff (Utilities)   

    SleepOff 1.0


    Device: iOS iPad Only
    Category: Utilities
    Price: Free, Version: 1.0 (iTunes)

    Description:

    Optimized for all iPads from iOS 8 on. Have you replaced your old iPad with a brand new one and don't know what to do with the old device, which still works very well? Recycle it into a sophisticated Internet clock radio!

    FUNCTIONS:
    — Displays the date and time, formatted according to the selected language.
    — Listen to your favorite Internet radio through a built-in database of over 4,000 stations.
    — Radio stations are selected using three filters: country, language, and genre.
    — Store up to 6 different stations, each accessible with a single tap.
    — Adjust the volume and the brightness of the screen with a single swipe.
    — Change the language of the interface regardless of the language of the iPad. Currently, French and English are supported. Other languages will be added according to demand.
    — By default, the display simulates a monochrome plasma green hue. The plasma color can be changed according to user preferences. It is also possible to change the background color for a more varied display.
    — Set listening to your favorite radio station at the time of your choice with sophisticated alarms. Each alarm lets you choose:
    * The days of the week
    * The start time
    * The listening time
    * The snooze duration, if you want to sleep a little bit longer
    * The fade-in duration, to gradually increase the volume and avoid a brutal wake-up
    * The radio station, among those stored

    — Fall asleep while listening to your favorite radio. In addition to the playing time, set the fade-out to gradually reduce the volume and accompany your dive into sleep.

    SleepOff is designed to remain permanently active. However, if the radio is on when the app passes into the background, it will continue to play. Similarly, if an alarm goes off while the app is in the background, a notification is displayed and a sound pulses for 30 seconds.

    SleepOff


              Database Administrator - Level 3 - SPLICE - Wawa, ON   
    The planning, design, and development of new database applications; Are you an experienced Database Administrator who takes pride in keeping projects running...
    From Splice - Fri, 30 Jun 2017 23:54:41 GMT - View all Wawa, ON jobs
              DATABASE MODELLER / IM MODELLER, LEVEL 3 - SPLICE - Wawa, ON   
    Providing technical expertise in the use and optimization of data modeling techniques to team members; Designing, developing and maintaining logical AND...
    From Splice - Mon, 15 May 2017 19:13:05 GMT - View all Wawa, ON jobs
              History.dat v1.87   
    A new version of this informative file for MAME has been released. Here is what changed:
    - Updated to MAME 0.187
    - Includes all the database updates since 1.86.
    Download History.dat v1.87
    Official History.dat website

              SandstoneDb GOODS adaptor   

    SandstoneDb was written mostly as a rails'ish API for a simple object database for use in small office and prototype applications (plus I needed a db for this blog). Which object database wasn't really important to me at the time, it was the API that I wanted, so I made the actual object store backing it pluggable and initially wrote two different store adaptors for it. The first was a memory store, little more than a dictionary of dictionaries, against which I wrote all the unit tests. The second was a Prevayler-style file based store that used SmartRefStream serialization and loaded everything from disk on startup; this provided a crash-proof Squeak image which wouldn't lose data.

    I figured eventually, for fun, I might get around to writing adaptors for some of the other object database back-ends that are in use: GOODS and Omnibase. I never really got around to it; however, Nico Schwarz has written a GOODS adaptor for SandstoneDb. This will let you hook up multiple Squeak images to a single store and should scale better than the file store that SandstoneDb defaults to.

    Go check it out and let him know what you think of it. This is just the kind of project that'll help programmers new to Seaside get going and get accustomed to using an object database rather than a relational one. It looks like his first blog post as well, so swing by and leave a comment to encourage more posts, we need more bloggers spreading the word!


              OnSmalltalk is Now Written In Smalltalk   

    OK, this blog is finally written in Seaside; no more chasing the latest Wordpress version. It just makes more sense for a blog mostly about Smalltalk and Seaside to be written in Smalltalk and Seaside. I haven't felt like writing for a long time, so I'm hoping this change will make the blog more fun and get me paying attention to it again; she's been neglected for a while.

    The code tops out at just under 800 lines so far for the Seaside, database and data migration, RSS and Atom feed, and Google sitemap code. It could have been less if I didn't have to support the Atom feed or deal with all the existing links and data from the old Wordpress site without breaking them, but that wasn't an option.

    Hopefully, it's not full of bugs, but it supports clean URLs and a nicer threaded Ajax comment system. It'll be interesting to see how managing it compares to managing Wordpress. No more huge library of plugins or community to turn to, just my own simple Smalltalk code with the features I need and only the features I need. Just hope the spam doesn't kill me, guess I'll find out soon enough.


              SandstoneDb, Simple ActiveRecord Style Persistence in Squeak   

    On Persistence, Still Not Happy

    Persistence is hard and something you need to deal with in every app. I've written about what's available in Squeak, written about simpler image based solutions for really small systems where just dumping out to one file is sufficient; however, nothing I've used so far has satisfied me completely for various reasons, so before I get to the point of this post, let me do a quick review of my current thoughts on the matter.

    Relational Databases

    Tired of em, I don't care how much they have to offer me in the areas of declarative indexing and queries, transactions, triggers, stored procedures, views, or any of the handful of things they offer that I don't really want from them. The price they make me pay in programming just isn't worth it for small systems. I don't want my business logic in the database. I don't want to use a big mess of tables to model all my data as a handful of global variables, aka tables, that multiple applications share and modify freely. What I do want from them, transactional persistence of my object model, they absolutely suck at and all attempts to shoehorn an object model into a relational database ends up being an exercise in frustration, compromise, and cussing. I think using a database as an integration point between multiple applications is a terrible idea that just leads to a bunch of fragile applications and a data model you can't change for fear of breaking them. Enough said, on to more object oriented approaches!

    Active Record

    Ruby on Rails has brought the ActiveRecord pattern mainstream, which was, as far as I know, first popularized in Martin Fowler's book Patterns of Enterprise Application Architecture, which largely dealt with all the various known methods of mapping objects to databases. Initially I wasn't a fan of the pattern and preferred the more complex domain model with a metadata mapping, but having written an object relational mapper at a previous gig, used several open source ones, and tried out several pure object databases, I've come to appreciate the simplicity and explicitness of its simple API.

    If you have to work with a relational database, this is a fairly good compromise for doing so. You can't bind a real object model to a relational database cleanly without massive effort, so don't try, just revel in the fact that you're editing rows rather than trying to hide it. It works reasonably well, and it's easy to get other team members to use it because it's simple.

    "Simplicity is the ultimate sophistication" -- Leonardo Da Vinci

    Other Approaches

    A total OO purist, or a young one still enamored with patternitis, wouldn't want objects to save themselves like an ActiveRecord does. You can see this in the design of most object oriented databases available; it's considered a sin to make you inherit from a class to obtain persistence. I used to be one of those guys too, but I've changed my mind in favor of pragmatism. The typical usage pattern is to create a connection to the OODB server, which basically presents itself to you as a persistent dictionary of some sort where you put objects into it and then "commit" any unsaved changes. They will save any object and leave it up to you what your object should look like, intruding as little as possible on your domain, so they say.

    Behind the scenes there's some voodoo going on where this persistent dictionary tries to figure out what's actually been changed either by having installed some sort of write barrier that marks objects dirty automatically when they get changed, comparing your objects to a cached copy created when they were originally read, or sometimes even explicitly forcing the programmer to manually mark the object dirty. The point of all of this complexity of course, is to minimize writes to the disk to reduce IO and keep things snappy.

    Simplicity Matters

    What seems to be overlooked in this approach is the amount of accidental complexity that is imposed upon the programmer. If I have to open a connection to get a persistent dictionary to work with, I now have to store this configuration information, manage the creation of this connection, possibly pool it if it's an expensive resource, and decide where to hang this dictionary so I can have access to it from within my application. This is usually some sort of current session object I can always reach such as a WASession subclass in Seaside. Now, this all actually seems pretty normal, but should it be?

    I'm not saying this is wrong, but one has to be aware of the trade-offs made for any particular API or style. At some point you have to wonder if we're not suffering from some form of technical Stockholm syndrome where we forget that all this complexity is killing us and we forget just how painful it really is because we've grown accustomed to it.

    Sit down and try explaining one of your programs that use some of this stuff to another programmer unfamiliar with your setup. If you really pay attention, you'll notice just how much of the explaining you're doing has nothing to do with the actual problem you're trying to solve. Much of it is just accidental complexity for plumbing and scaffolding that crept in. If you spend more time explaining the persistence framework than your program and the actual problem it's solving, then maybe that's a problem you'll want to revisit sometime. Do I really want to write code somewhat like...

    user := User firstName: 'Ramon' lastName: 'Leon'.
    self session commit: [ self session users at: user id put: user ].
    

    with all the associated configuration setup and cognitive load of remembering what I called the accessor to get #users and how I'm hashing the user for this or that class, while remembering the semantics of what exactly is committed, or whether I forgot to mark something dirty, or would I rather do something more straightforward and simple like this...

    user := User firstName: 'Ramon' lastName: 'Leon'.
    user save.
    

    And just assume the object knows how to persist itself and there's no magic going on? If I say save I just know it commits to disk, whether there were any changes or not. No setup, no configuration, no magic, just save the damn object already.

    Contrary to popular belief, disk IO is not the bottleneck, my time is the bottleneck. Computers are cheap, ram is cheap, disks are cheap, programmer's time is usually by far the largest expense on any project. Something simple that just works OK but solidly every time is far more useful to me than something complex that works really really well most of the time but still breaks in weird ways occasionally, forcing me to dig into someone else's complex code for change detection or topological insertion sorting and blow a week of programmer time working on god damn plumbing. I want to spend as much time as possible when programming working on my actual problem, not fighting with the persistence framework to get it to behave correctly or map my object correctly.

    A Real Solution

    Of course, GemStone is offering GLASS, a 4 gig persistent image that just magically solves all your problems. That will be the preferred option for persistence when you really need to scale in the Seaside world, and I for one will be using it when necessary; however, it does require a 64 bit server and introduces the small additional complexity of changing to an entirely different Smalltalk and learning its class library. Definitely an option if you outgrow Squeak. But will you? I'll get into GemStone more in another post when I can get more into it and give it the attention it deserves, but my main point now is that there's still a need for simple GemStone'ish like persistence for Squeak.

    Reality Check

    Let's be honest, most apps don't need to scale. Most apps in the real world are written to run small businesses, what DHH calls the fortune five million. The simple fact is, in all likelihood scaling is not and probably won't ever be your problem. We might like to think we're writing the next YouTube or Twitter, but odds are we're not. You can make a career just replacing spread sheets from hell with simple applications that make people lives easier without ever once hitting the limits of a single Squeak image (such was the inspiration for DabbleDb), so don't waste your time scaling.

    You don't have a scaling problem unless you have a scaling problem. Even if you do have an app that needs to scale, it'll probably need 2 or 3 back end supporting applications that don't, and it's a waste of time making them scale if they don't need to. If scaling ever becomes a problem, be happy, it's a nice problem to have unless you're doing something stupid like giving away all of your services for free and hoping you'll figure out that little money thing later on.

    Conventions Rule

    Ruby on Rails has shown us that beyond making things easier with ActiveRecord, things often need to be made more structured and less configurable. Configuration is a hidden complexity that Java has shown can kill any chance for any real productivity, sometimes having more configuration than actual code. It's amazing how much simpler programs can get if you just have the guts to make a few tough choices, decide how you want to do things, and always do it that way. Ruby on Rails true contribution to the programming community was its convention over configuration philosophy, ActiveRecord itself was in use long before Rails.

    Convention over configuration is really just a nice way of the framework writer saying "This is how it's done and if you don't like it, tough." The problem then of course becomes finding a framework with conventions you agree with, but it's a big world, you're probably a programmer if you're reading this, so if you can't find something, write your own. The only problem with other people's frameworks is that they're other people's frameworks. There's nothing quite like living in a world of your own creation.

    What I Wanted

    I wanted something like ActiveRecord from Rails but not mapped to a relational database, that I could use with Seaside and Squeak for small applications. I've accepted that if I need to scale, I'll use GemStone, this limits what I need from a persistence solution for Squeak.

    For Squeak, I need a simple, fast, configuration free, crash proof, easy to use object database that doesn't require heavy thinking to use, optimize, or explain to others, one that allows me to build and iterate prototypes and small applications quickly without having to keep a schema in sync or stop to figure out why something isn't working, or why it's too slow to be usable.

    I don't want any complex indexing schemes to be necessary, which means I want something like a prevalence system where all the objects are kept in memory all the time so everything is just automatically fast. I basically just want my classes in Squeak to be persistent and crash proof. I don't need a query language, I have the entire Smalltalk collections hierarchy at my disposal, and I sure as hell don't need SQL.

    I also don't want a bunch of configuration. If I want to find all the instances of a User in memory I can simply say...

    someUsers := User allInstances.
    

    Without having to first go and configure what memory #allInstances will refer to because obviously I want #allInstances in the current image. After all, isn't a persistent image what we're really after to begin with? Don't we just want our persistent objects to be available to us as if they were just always in memory and the image could never crash? Shouldn't our persistent API be nearly as simple?

    Since I'm basically after a persistent image, I don't need any configuration; the image is my configuration. It is my unit of deployment and I've already got one per app/customer anyway. I don't currently, nor do I plan on running multiple customers out of a single image so I can simply assume that when I persist an instance, it will be stored automatically in some subdirectory in the directory my image itself is in, overridable of course, but with a suitable default. If I want to host another instance of a particular database, I'll put another image in a different directory and fire it up.

    And now I'm finally getting to the point...

    SandstoneDb

    Since I couldn't find anything that worked exactly the way I wanted, though Prevayler was pretty close, I just wrote my own. It's a simple object database that uses SmartRefStreams to serialize clusters of objects to disk. Ordinary ReferenceStreams can mix up your instance variables when deserializing older versions of a class.

    The root of each cluster is an ActiveRecord / OODB hybrid. It makes ActiveRecord a bit more object oriented by treating it as an aggregate root and its class as a repository for its instances. I'm mixing and matching what I like from Domain Driven Design, Prevayler, and ActiveRecord into a single simple framework that suits me.

    SandstoneDb API

    To use SandstoneDb, just subclass SDActiveRecord and restart your image to ensure the proper directories are created, that's it, there is no further configuration. The database is kept in a subdirectory matching the name of the class in the same directory as the image. This is a Prevayler like system so all data is kept in memory written to disk on save; on system startup, all data is loaded from disk back into memory. This keeps the image itself small.
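
    For instance, a minimal record class might look like this sketch (the #Person class and its instance variables are hypothetical):

    SDActiveRecord subclass: #Person
        instanceVariableNames: 'name email'
        classVariableNames: ''
        poolDictionaries: ''
        category: 'MyApp-Model'
    

    Restart the image once after adding the class and Person gets its own subdirectory next to the image.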

    Like Prevayler, there's a startup cost associated with loading all the instances into memory and rebuilding the object graph, however once loaded, accessing your objects is blazing fast and you don't need to worry about indexing or special query syntaxes like you would with an on disk database. This of course limits the size of the database to whatever you're willing to put up with in load time and whatever you can fit in ram.

    To give you a rough idea, loading up a 360 meg database containing about 73,000 hotel objects on my 3ghz Xeon Windows workstation takes about 57 minutes. That's an average of about 5k per object. Hefty and definitely pushing the upper limits of acceptable. Of course load time will vary depending upon your specific domain and the size of the objects. This blog is nearly two years old and only has a few hundred objects varying from 2k to 90k, some of my customers have been using their small apps for nearly a year and only accumulated 500 to 600 business objects averaging 0.5k each. Load time for apps this small is insignificant and using a relational database would be akin to using a sledge hammer to hang an index card with a thumb tack.

    API

    SandstoneDb has a very simple API for querying and iterating on the class side representing the repository for those instances:

    queries

    • #atId: (for fetching a record by its #id)
    • #atId:ifAbsent:
    • #do: (for iterating all records)
    • #find: (for finding first matching record)
    • #find:ifAbsent:
    • #find:ifPresent:
    • #findAll (for grabbing all records)
    • #findAll: (for finding all matching records)

    Since these are pretty much just variations of #select: and #detect:, little if any explanation is required for how to use them. The #find naming is to make it clear these queries could potentially be more expensive than just the standard #select: and #detect:.

    Though it's memory based now, I'm leaving open the option of future implementations that could be disk based allowing larger databases than will fit in memory; the same API should work regardless.

    There's an equally simple API for the instance side:

    Accessors that come in handy for all persistent objects.

    • #id (a UUID string in base 36)
    • #createdOn
    • #updatedOn
    • #version (useful in critical sections to validate you're working on the version you expect)
    • #indexString (all instance variables' asStrings as a single string for easy searching)

    Actions you can perform on a record.

    • #save (thread safe)
    • #save: (same as above but you can pass a block if you have other work you want done while the object is locked)
    • #critical: (grabs or creates a Monitor for thread safety)
    • #abortChanges (rollback to the last saved version)
    • #delete (thread safe)
    • #validate (for subclasses to override and throw exceptions to prevent saves; see the sketch after this list)
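
    For example, a #validate override on a hypothetical #Person record might look like this sketch (the rule is made up; only the hook is real):

    validate
        self name isEmptyOrNil
            ifTrue: [ Error signal: 'a person must have a name' ]
    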

    You can freely have records holding references to other records, but a record must be saved before it can be referenced. If you attempt to save an object that references another record that answers true to #isNew, you'll get an exception. Saves are not cascaded; only the programmer can know the proper save order his object model requires. To do safe cascaded saves would require actual transactions. Saves are always explicit: if you didn't save it, it wasn't saved, there is no magic, and you should never be left scratching your head wondering if your objects were saved or not.

    Events you can override to hook into a records life cycle.

    • #onBeforeFirstSave
    • #onAfterFirstSave
    • #onBeforeSave
    • #onAfterSave
    • #onBeforeDelete
    • #onAfterDelete

    Be careful with these: if an exception occurs you will prevent the life cycle from completing properly, but then again, that might be what you intend.

    A testing method you might find useful on occasion.

    • #isNew (answers true prior to the first successful save)

    Only subclass SDActiveRecord for aggregate roots where you need to be able to query for the object; for all other objects just use ordinary Smalltalk objects. You DO NOT need to make every one of your domain objects into ActiveRecords, this is not Ruby on Rails. Choosing your model carefully gives you natural transaction boundaries, since the save of a single ActiveRecord and all ordinary objects contained within is atomic and stored in a single file. There are no real transactions, so you cannot atomically save multiple ActiveRecords.

    A good example of an aggregate root object would be an #Order class, while its #LineItem class would just be an ordinary Smalltalk object. A #BlogPost is an aggregate root while a #BlogComment is an ordinary Smalltalk object. #Order and #BlogPost would be ActiveRecords. This allows you to query for #Order and #BlogPost but not for #LineItem and #BlogComment, which is as it should be; those items don't make much sense outside the context of their aggregate root, and no other object in the system should be allowed to reference them directly, only aggregate roots can be referenced by other objects.

    This of course means should you improperly reference, say, a #LineItem from an object other than its parent #Order (which is the root of the file they're both stored in), then you'll ultimately end up referencing a copy rather than the original, because such a reference won't be able to maintain its identity after an image restart.

    In the real world, this is more than enough to write most applications. Transactions are a nice to have feature, they are not a must have feature and their value has been grossly oversold. Starbucks doesn't use a two phase commit, and it's good to remind yourself that the world chugs on anyway, mistakes are sometimes made and corrective actions are taken, but you don't need transactions to do useful work. MySql became the most popular open source database in existence long before they added transactions as a feature.

    Here are some examples of using an ActiveRecord...

    person := Person find: [ :e | e name = 'Joe' ].
    person save.
    person delete.
    user := User find: [ :e | e email = 'Joe@Schmoe.com' ] ifAbsent: [ User named: 'Joe' email: 'Joe@Schmoe.com' ].
    joe := Person atId: anId.
    managers := Employee findAll: [ :e | e subordinates notEmpty ].
    

    Concurrency is handled by calling either #save or #save: and it's entirely up to the programmer to put critical sections around the appropriate code. You are working on the same instances of these objects as other threads and you need to be aware of that to deal with concurrency correctly. You can wrap a #save: around any chunk of code to ensure you have a lock on that object like so...

    auction save:[ auction addBid: (Bid price: 30 dollars user: self session currentUser) ].
    

    While #critical: lets you decide when to call #save, in case you want other stuff inside the critical section of code to do something more complex than a simple implicit save. When you're working with multiple distributed systems, like a credit card processor, transactions don't really cut it anyway so you might do something like save the record, get the auth, and if successful, update the record again with the new auth...

    auction critical: [
        [ auction
            acceptBid: aBid;
            save;
            authorizeBuyerCC;
            save ]
                on: Error do: [ :error | auction reopen; save ] ]
    

    That's about all there is to using it, there are some more things going on under the hood like crash recovery and startup but if you really want to know how that works, read the code. SandstoneDb is available on SqueakSource and is MIT licensed and makes a handy development and prototyping or small application database for Seaside. If you happen to use it and find any bugs or performance issues, please send me a test case and I'll see what I can do to correct it quickly.


              Simple Image Based Persistence in Squeak   

    One of the nicest things about prototyping in Smalltalk is that you can delay the need to hook up a database during much of your development, and if you're lucky, possibly even forever.

    It's a mistake to assume every application needs a relational database, or even a proper database at all. It's all too common for developers to wield a relational database as a golden hammer that solves all problems, but for many applications they introduce a level of complexity that can make development feel like wading through a pond full of molasses, where you spend much of your time trying to keep the database schema and the object schema in sync. It kills both productivity and fun, and god dammit, programming should be fun!

    This is sometimes justified, but many times it's not. Many business applications and prototypes are built to replace manual processes using Email, Word, and Excel. Word and Excel, by the way, aren't ACID compliant, don't support transactions, and manage to successfully run most small businesses. MySql became wildly popular long before it supported transactions, so it's pretty clear a wide range of apps just don't need that, no matter how much relational weenies say it's required.

    It shouldn't come as a surprise that one can take a single step up the complexity ladder, and build simple applications that aren't ACID compliant, don't support transactions, and manage to successfully run most small businesses better than Word and Excel while purposely not taking a further step and moving up to a real database which would introduce a level of complexity that might blow the budget and make the app infeasible.

    No object relational mapping layer (not even Rails and ActiveRecord) can match the simplicity, performance, and speed of development one can get just using plain old objects that are kept in memory all the time. Most small office apps with no more than a handful of users can easily fit everything into memory, this is the idea behind Prevayler.

    The basic idea is to use a command pattern to apply changes to your model; you can then log the commands, snapshot the model, and replay the log in case of a crash to bring the last snapshot up to date. Nice idea, if you're OK creating commands for every state changing action in your application and being careful with how you use timestamps so replaying the logs works properly. I'm not OK with that, it introduces a level of complexity that is overkill for many apps and is likely the reason more people don't use a Prevayler like approach.
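
    To make that concrete, a Prevayler-style command might look something like this sketch (all names are hypothetical):

    "every state change gets its own command object that the prevalence layer logs before executing"
    Object subclass: #AddCommentCommand
        instanceVariableNames: 'postId comment'
        classVariableNames: ''
        poolDictionaries: ''
        category: 'PrevaylerSketch'

    executeOn: aBlog
        "apply the change to the model; replaying the log just re-runs these"
        (aBlog postAt: postId) addComment: comment
    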

    One might attempt to use the Smalltalk image itself as a database (and many try), but this is rife with problems. My average image is well over 30 megs, saving it takes a bit of time, and saving it while processing HTTP requests risks all kinds of things going wrong as the image prepares for what is essentially a shutdown/restart cycle.

    Using a ReferenceStream to serialize objects to disk Prevayler style, but ignoring the command pattern part and just treating it more like crash proof image persistence is a viable option if your app won't ever have that much data. Rather than trying to minimize writes with commands, you just snapshot the entire model on every change. This isn't as crazy as it might sound, most apps just don't have that much data. This blog for example, a year and a half old, around 100 posts, 1500 comments, has a 2.1 megabyte MySql database, which would be much smaller as serialized objects.

    If you're going to have a lot of data, clearly this is a bad approach, but if you're already thinking about how to use the image for simple persistence because you know your data will fit in ram, here's how I do it.

    It only takes a few lines of code in a single abstract class, which you can subclass for each project, to make a Squeak image fairly robust and crash proof and more than capable enough to let you just use the image, no database necessary. We'll start with a class...

    Object subclass: #SMFileDatabase
        instanceVariableNames: ''
        classVariableNames: ''
        poolDictionaries: ''
        category: 'SimpleFileDb'
    
    SMFileDatabase class
        instanceVariableNames: 'lock'
    

    All the methods that follow are class side methods. First, we'll need a method to fetch the directory where rolling snapshots are kept.

    backupDirectory
        ^ (FileDirectory default directoryNamed: self name) assureExistence.
    

    The approach I'm going to take is simple: a subclass will implement #repositories to return the root objects that need to be serialized. I just return an array containing the root collection of each domain class.

    repositories
        self subclassResponsibility
    

    The subclass will also implement #restoreRepositories: which will restore those repositories back to wherever they belong in the image for the application to use them.

    restoreRepositories: someRepositories
        self subclassResponsibility
    

    Should the image crash for any reason, I want the last backup to be fetched from disk and restored. So I need a method to detect the latest version of the backup file, whose name I'll stick a version number in when saving.

    lastBackupFile
        ^ self backupDirectory fileNames 
            detectMax: [:each | each asInteger]
    

    Once I have the file name, I'll deserialize it with a read only reference stream (don't want to lock the file if I don't plan on editing it)

    lastBackup
        | lastBackup |
        lastBackup := self lastBackupFile.
        lastBackup ifNil: [ ^ nil ].
        ^ ReferenceStream 
            readOnlyFileNamed: (self backupDirectory fullNameFor: lastBackup)
            do: [ : f | f next ]
    

    This requires you to extend ReferenceStream with #readOnlyFileNamed:do:; just steal the code from FileStream, so nicely provided by Avi Bryant, which encapsulates the #close of the streams behind #do:. Much nicer than having to remember to close your streams.
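
    For reference, the extension amounts to something like this class-side sketch (my reconstruction, not Avi's exact code):

    readOnlyFileNamed: aFileName do: aBlock
        | file |
        file := FileStream readOnlyFileNamed: aFileName.
        ^ [ aBlock value: (self on: file) ] ensure: [ file close ]
    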

    Now I can provide a method to actually restore the latest backup. Later, I'll make sure this happens automatically.

    restoreLastBackup
        self lastBackup ifNotNilDo: [ : backup | self restoreRepositories: backup ]
    

    I like to keep around the last x number of snapshots to give me a warm fuzzy feeling that I can get old versions should something crazy happen. I'll provide a hook for an overridable default value in case I want to adjust this for different projects.

    defaultHistoryCount
        ^ 15
    

    Now, a quick method to trim the older versions so I'm not filling up the disk with data I don't need.

    trimBackups
        | entries versionsToKeep |
        versionsToKeep := self defaultHistoryCount.
        entries := self backupDirectory entries.
        entries size < versionsToKeep ifTrue: [ ^ self ].
        ((entries sortBy: [ : a : b | a first asInteger < b first asInteger ]) 
            allButLast: versionsToKeep) 
                do: [ : entry | self backupDirectory deleteFileNamed: entry first ]
    

    OK, I'm ready to actually serialize the data. I don't want multiple processes all trying to do this at the same time, so I'll wrap the save in a critical section, call #trimBackups, figure out the next version number, and serialize the data (#newFileNamed:do: being another stolen FileStream method), ensuring to #flush it to disk before continuing (don't want the OS doing any write caching).

    saveRepository
        | version |
        lock critical: 
            [ self trimBackups.
            version := self lastBackupFile 
                ifNil: [ 1 ]
                ifNotNil: [ self lastBackupFile name asInteger + 1 ].
            ReferenceStream 
                newFileNamed: (self backupDirectory fullPathFor: self name) , '.' , version asString
                do: [ : f | f nextPut: self repositories ; flush ] ]
    

    So far so good, let's automate it. I'll add a method to schedule the subclass to be added to the start up and shutdown sequence. You must call this for each subclass, not for this class itself.

    UPDATE: This method also initializes the lock and must be called prior to using #saveRepository; this seems cleaner.

    enablePersistence
        lock := Semaphore forMutualExclusion.
        Smalltalk addToStartUpList: self.
        Smalltalk addToShutDownList: self
    

    So on shutdown, if the image is actually going down, just save the current data to disk.

    shutDown: isGoingDown 
        isGoingDown ifTrue: [ self saveRepository ]
    

    And on startup we can #restoreLastBackup.

    startUp: isComingUp 
        isComingUp ifTrue: [ self restoreLastBackup ]
    

    Now, if you want a little extra snappiness and you're not worried about making the user wait for the flush to disk, I'll add a little convenience method for saving the repository on a background thread.

    takeSnapshot
        [self saveRepository] forkAt: Processor systemBackgroundPriority
            named: 'snapshot: ' , self class name
    

    And that's it: half a Prevayler, and a more robust, easy to use setup that's a bit better than trying to shoehorn the image into being your database for those small projects where you really, really don't want to bother with a real database (blogs, wikis, small apps, etc). Just sprinkle a few MyFileDbSubclass saveRepository or MyFileDbSubclass takeSnapshot calls around your application wherever you feel it's important, and you're done.
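
    For concreteness, a subclass for a small blog app might look like this sketch (#BlogPost and its class-side #posts collection are hypothetical):

    SMFileDatabase subclass: #BlogDb
        instanceVariableNames: ''
        classVariableNames: ''
        poolDictionaries: ''
        category: 'SimpleFileDb'

    repositories
        ^ Array with: BlogPost posts

    restoreRepositories: someRepositories
        BlogPost posts: someRepositories first
    

    One BlogDb enablePersistence at startup wires it into the image's startup and shutdown lists, and that's the whole configuration.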

    Here's a file out if you just want the code fast, SMFileDatabase.st


              FAQ   
    Akismet checks your comments and contact form submissions against our global database of spam to protect you and your site from malicious content.
              Global Planning Content Launch Manager - HP - Houston, TX   
    Expertise and knowledge in the online space (web, email, search, database marketing, chat marketing, podcasting, blogging, privacy, e-business, etc), including...
    From HP - Sat, 01 Jul 2017 11:35:09 GMT - View all Houston, TX jobs
          Maybe Better Than a Gravel Bike! Specialized's Entry-Level Endurance Road Bike, the Allez, Comes Recommended!   

    Hello, this is Narinari (^ ^)

    I ride GIANT's gravel bike, the REVOLT 1 (see "Gravel bikes handle forest roads, gravel and long touring with ease! A thorough comparison of recommended 2016 models in the 100,000-yen range" on Narinari Diary), and I like it well enough, but:

    • The forest roads turned out to be farther from home than I expected, 70-80 km each way, and climbing steep grades after riding that far is tough, so the bike rarely gets used for its purpose
    • At over 10 kg, the weight can be hard on climbs
    • The disc brakes make an annoying buzzing noise

    So I sometimes think I might not have needed a gravel bike after all (^◇^;)


    Lately, entry-level endurance aluminum road bikes are getting a fresh look:

    Aluminum road bikes to choose from | Trek Bikes (JP)

    TREK's Émonda ALR and Domane ALR looked good too, but


    https://www.specialized.com/ja/ja/bikes/road/allezAllez | Specialized

    Specialized's Allez stands out:

    ■ Ultralight: the fork and frame together shed 450 g compared with the previous generation. In hill climbs and races, lightness raises both performance and confidence, and it's a welcome point when carrying the bike up stairs in city life.

    The complete bike weight isn't published, but surely it's under 10 kg. Is it under 9 kg, like the TREK Émonda ALR4?

    ■ Handling: a lighter, stronger head area. The E5 aluminum tapered head tube and the purpose-designed full carbon fork deliver handling sharp, precise, and smooth enough to feel the moment you ride.

    A full carbon fork is great because it absorbs vibration (^o^)

    ■ Geometry: a new geometry based on data from Allez riders worldwide accumulated in the Retul Fit Database, matching their needs to a high standard. It has moved closer to the higher-end Roubaix, delivering sharp handling for a wider range of riders.


    ■ Versatility: tire clearance up to 700×28c, plus mounts for racks and fenders. Features that expand a rider's possibilities and realize the ideal ride, implemented without hurting performance.

    ■ Refined finish: internal cable routing with good serviceability. The beautifully finished cable entry ports are functional beauty at work. The premium, smooth finish and graphics, combined with the high performance, captivate anyone who looks at it.


    Unfortunately, "lightweight" aside, the complete bike weight isn't disclosed, but with:

    • Clearance for 28c tires
    • Rack and fender mounts
    • A full carbon fork

    it comes remarkably close to a gravel bike. In exchange for dropping the rarely needed 30c-plus tires and disc brakes, it appears to save weight, which is good!


    The Allez comes in two builds: the Elite with Shimano 105 components and the Sport with SORA.

    Allez Elite

    ALLEZ ELITE BLK/WHT 49 (49 Satin Black/White/Clean): BIKE | Specialized Japan online sports bike store

    At a suggested retail price of 162,000 yen, this sits right at the upper limit for the entry-level riders likely to consider this bike.

    In return, the shift levers, front and rear derailleurs, and cassette sprockets are Shimano 105, a high-end component group for this class of bike.

    The tires are 25c, the standard width for endurance road bikes, with clearance up to 28c.

    28c is the stock width on my cross bike, a GIANT ESCAPE R3, and it's plenty comfortable.


    There are four colors, generous for a road bike, especially in the 105 class.

    The safe Black/White (BLK/WHT)


    The vivid Gloss Team Yellow/Tarmac Black (TEAMYEL/TARBLK)


    The fresh Light Blue/Rocket Red (LTBLU/RKTRED)


    The distinctive Satin Cool Grey/Gloss Hot Pink (CLGRY/HTPNK)


    Quite a tough choice.

    Pink might be too much for an old guy like me (^◇^;)

    Allez Sport

    ALLEZ SPORT CSMWHT/BLK 49 (49 Gloss Cosmic White/Satin Black): BIKE | Specialized Japan online sports bike store

    At a suggested retail price of 108,000 yen, the price is attractive for entry-level users.

    The shift levers, front and rear derailleurs, and cassette sprockets are Shimano SORA, one grade above the bottom-of-the-range CLARIS.

    Since last year for SORA, and from this year's models for CLARIS, the shifters no longer have cables sticking out like antennae, so at a glance they look no different from the higher grades, which is nice.

    CLARIS has 8 rear speeds, SORA 9, TIAGRA 10, and 105 11, so the gearing is somewhat coarser than 105's, but there's little difference between TIAGRA and SORA.

    Other parts differ as well, but the full carbon fork and the tires are shared.

    This one comes in two colors:


    The safe Gloss Cosmic White/Satin Black (CSMWHT/BLK)


    The rather distinctive Satin Navy/Gloss Nordic Red (NVY/NRDCRED)

    Satin Navy looks great!


    The Elite is tempting, but the Sport is probably plenty for now.

    If I were buying this year, I think that's the one I'd choose.


              New Post: New to XERP   
    Hi,
    I am new to XERP. I created a database and ran the script to create the tables. Where can I find an explanation of each of the tables and their relationships with the others? For example, why are there Companies, CompanyType, and CompanyCodes tables? What is the relationship between them? And why is this pattern repeated in other tables, such as Departments, DepartmentTypes, and DepartmentCodes?

    Thanks in advance

    Jose Patron
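
    The naming suggests the common lookup-table pattern: a Types table enumerates fixed categories, a Codes table holds user-definable code values, and the main entity references both; the Departments tables repeat the same shape. A minimal sketch of that pattern using SQLite from Python follows; the table and column names are guesses for illustration, not XERP's actual schema.

        import sqlite3

        conn = sqlite3.connect(":memory:")
        # An illustrative guess at the pattern, NOT the real XERP schema:
        conn.executescript("""
        CREATE TABLE CompanyTypes (TypeId INTEGER PRIMARY KEY, Name TEXT);
        CREATE TABLE CompanyCodes (CodeId INTEGER PRIMARY KEY, Code TEXT);
        CREATE TABLE Companies (
            CompanyId INTEGER PRIMARY KEY,
            Name      TEXT,
            TypeId    INTEGER REFERENCES CompanyTypes(TypeId),
            CodeId    INTEGER REFERENCES CompanyCodes(CodeId)
        );
        """)
        conn.execute("INSERT INTO CompanyTypes VALUES (1, 'Subsidiary')")
        conn.execute("INSERT INTO CompanyCodes VALUES (1, 'ACME-01')")
        conn.execute("INSERT INTO Companies VALUES (1, 'Acme Ltd', 1, 1)")

        # Joining the three tables recovers the full picture of one company.
        row = conn.execute("""
            SELECT c.Name, t.Name, k.Code
            FROM Companies c
            JOIN CompanyTypes t ON t.TypeId = c.TypeId
            JOIN CompanyCodes k ON k.CodeId = c.CodeId
        """).fetchone()
        print(row)  # ('Acme Ltd', 'Subsidiary', 'ACME-01')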

              How to Use PR to Boost Your Content Marketing Program   
    Now what? Your company writes a blog. Delivers a webinar. Creates an e-book and shares it on social media, the company website and through the customer database. Your organization watches for engagement and the sales team waits for the pipeline to fill.
              Get-SQLServer2.ps1   
    #Requires -Version 3.0 
    <# 
    .SYNOPSIS 
        This script gets a list of SQL Servers on the subnet 
    .DESCRIPTION 
        This script uses SMO to find all the local SQL Servers 
        and displays them 
     
    .NOTES 
        File Name  : Get-SQLServer2.ps1 
        Author     : Thomas Lee - tfl@psp.co.uk 
        Requires   : PowerShell Version 3.0 
    .LINK 
        This script posted to: 
            http://www.pshscripts.blogspot.com 
    .EXAMPLE 
        PS>  # On a Lync Server looking at Lync Implementation 
        PS>  Get-SQLServer2 
        There are 7 SQL Server(s) on the Local Subnet 
     
        ServerName      InstanceName Version 
        ----------      ------------ ------- 
        2013-LYNC-MGT   MON          10.50.2500.0 
        2013-LYNC-MGT   SCOM         10.50.2500.0 
        2013-TS         RTCLOCAL     11.0.2100.60 
        2013-SHAREPOINT SPSDB        11.0.3000.0 
        2013-LYNC-FE    RTC          11.0.2100.60 
        2013-LYNC-FE    RTCLOCAL     11.0.2100.60 
        2013-LYNC-FE    LYNCLOCAL    11.0.2100.60 
     
    #> 
    Import-Module SQLPS 
     
    # Get all the database servers visible on the local subnet 
    $SQLServers = [System.Data.Sql.SqlDataSourceEnumerator]::Instance.GetDataSources() 
    $Srvs = @() 
     
    # Convert the collection to an array 
    Foreach ($srv in $SQLServers) { 
        $Srvs += $srv 
    } 
     
    # If no servers were found, say so and stop 
    If ($Srvs.Count -le 0) { 
        "There are no SQL Servers on the Local Subnet" 
        return 
    } 
     
    # Print the server details 
    "There are {0} SQL Server(s) on the Local Subnet" -f $Srvs.Count 
    $Srvs | Select-Object ServerName, InstanceName, Version | Format-Table -AutoSize 

              Systems Analyst (Teradata Database Developer) - McCain Foods (Canada) - Florenceville, NB   
    McCain Foods is seeking a Systems Analyst, specialized in Teradata database development, to contribute to the success of our Enterprise Data Warehouse (EDW)...
    From McCain - Tue, 04 Apr 2017 12:46:27 GMT - View all Florenceville, NB jobs
              Systems Analyst (Teradata Database Developer) - McCain Foods (Canada) - Florenceville, NB   
    The company’s products can be found in thousands of restaurants and supermarket freezers in more than 160 countries around the world....
    From McCain - Tue, 04 Apr 2017 12:46:27 GMT - View all Florenceville, NB jobs
              Re: Timeouts on SimpleDB requests.   
    I often get timeouts when querying SimpleDB: no response comes for 20 seconds, sometimes even 200 seconds. I don't know how to solve this. The database I operate is very small, just several thousand records in one table.
    ...
              Adobe Workflow Technical Editor (Freelance) - Ontario Basketball - Ontario   
    Solid understanding of how databases are structured. As part of the Media Assets team, you will take on the responsibilities related to designing, implementing...
    From Ontario Basketball - Thu, 22 Jun 2017 21:21:36 GMT - View all Ontario jobs
              Oracle Database Administrator / Performance Tuning Specialist - RPM Technologies - Toronto, ON   
    *Job Summary* The Oracle Database Administrator / Performance Tuning Specialist is responsible for the maintenance and implementation of database changes for
    From Indeed - Thu, 29 Jun 2017 16:24:19 GMT - View all Toronto, ON jobs
              DBA2 DBA - Sky System Inc - Toronto, ON   
    *JOB RESPONSIBILITIES: * As a DB2 Database Administrator II, you will provide required support for business applications using DB2 databases. As part of a
    From Indeed - Fri, 02 Jun 2017 20:38:07 GMT - View all Toronto, ON jobs
              Oracle DBA / Performance Specialist - RPM Technologies - Toronto, ON   
    The Oracle Database Administrator / Performance Tuning Specialist is responsible for the maintenance and implementation of database changes for our
    From RPM Technologies - Tue, 09 May 2017 22:27:01 GMT - View all Toronto, ON jobs
              Oracle Database Administrator - RPM Technologies - Toronto, ON   
    *About RPM* RPM Technologies provides software solutions and services to the largest financial services companies in Canada. We offer product record keeping
    From Indeed - Mon, 08 May 2017 20:40:57 GMT - View all Toronto, ON jobs
              What to say.....?   
    I don't know exactly what to say, but I figure it's been long enough, that if anyone even still bothers to check, I should write something. So let's cover the basics: work, school, SCA and home life.

    Work has picked up. They are finally starting to utilize me, and that feels good. I still have an amazing amount of do-nothing time, but that's more my choice than theirs as there are projects I could be working on, but I'm avoiding them. We've been building up to a major server redo (bringing a server up from Windows 2000 to Windows 2003 and SQL Server 2000 to SQL Server 2005). That started today and will take most of this week (and most of Saturday) to get worked out and cut back over. So far it's going well (knock on wood), but I'm still waiting for the other shoe to drop. I've read several articles that have suggested waiting until the next full release of SQL before upgrading, but the powers that be are worried about Microsoft discontinuing support of SQL 2000 sometime this year, so we are going ahead with the upgrade. It's only one of our servers, but it's a major one, and we're using this time to also test our Disaster Recovery methods (databases being passed to a virtualized server from backups and changing ODBC drivers on client machines, it's a relatively painless solution).

    School. Well, what can I say about school? I'm down to two requirements. I need to work my way through an individualized-instruction web development class, and I need to do my thesis. The web class is killing me. It's something I've never done. It was the one thing that I was looking forward to learning in the whole program, and now I have to take it as an independent study! I think I really would have preferred a classroom environment on this one. I suck at the web! I always really appreciated the talent that the guys at my old job had for putting together websites. They had an artistic eye, and they knew how to make the tags do what they wanted them to. I, decidedly, do NOT have an artistic eye, and I can't make the damn tags do anything I want them to. To top it off, I'm basically trying to learn all this stuff on my own. Thanks to one friend, who pointed me to www.w3schools.com, which has been a wonderful resource. If I've learned anything, it's thanks to that site. As for my thesis, well, I've had some preliminary discussions via email with the professor who I would like to be my thesis advisor, and I'm going to try and have a literature review completed prior to the end of February. I'm not sure whether or not I can get the thesis done by the end of the semester, but I'm going to try.

    SCA. Yeah, more of the same. Last Sunday was a practice. Now, I've been riding the exercise bike in the basement three to four times a week since the beginning of the year. I was hoping that my stamina would have increased, but it didn't. I got the chance to honestly teach for a little while at the beginning of practice. I showed someone the basics of the "wrap" shot. Imagine throwing a sword blow at someone's leg, then at the last moment, you turn your wrist and rotate the sword so that you hit them with the back blade instead of the front of the blade. What you wind up doing is hitting the person in the back of the leg with the sword instead of the front or side. That's the basics of a "wrap". There are several variations, and I showed someone most of what I knew about them. There were three Knights standing there, watching me teach this person, and they didn't interrupt to say that I was wrong on anything, and one of them actually said that I was a good teacher. I was pretty happy about that.

    The first half of practice I faced some of the same people I always face. I wasn't happy with my performance at all. Halfway through practice though, something interesting happened. A group of 5 or 6 fighters from a neighboring barony showed up. I actually got a chance to fight some people who I don't fight all the time. I did fairly well against them. One thing I did wrong though. I faced their toughest competitor last. I was completely out of gas when I was fighting him, and he tore me up hard. They said that they were planning on coming out again sometime. I'll have to make sure I fight that guy first. They were all very friendly, and seemed like great people to have around. I look forward to crossing swords again.

    On a completely different topic related to the SCA, I've had another example of how my saying "I suck" (which I do) can affect other people. On one of the online forums I read, someone asked the question "What are your goals". One of the Knights responded that he wanted to

    "Each day I ask that I can keep
    fooling my squires and everyone else
    for just one more day. "


    Now, this is a Knight whom I respect greatly. His presentation on the field is perfect. He is humble. He is witty. He is soft-spoken. He is honorable. This Knight is the example in our Kingdom of what it is to be Knightly. When I read that he thought he needed to fool people, I nearly screamed. I typed a response to the forum, but I thought that would be too public. I began to type an email response to him personally, but decided I shouldn't question his own opinion of himself. Still, I was honestly mad at him for not seeing in himself what everyone else sees every time he takes the field or talks to you (then another light bulb went off....hmmmm....maybe I might be a wee bit too hard on myself as well). I'm not sure if he drinks, but I owe him a beer. I want to sit by a campfire and tell him all the wonderful images and stories I know of him. I want to reassure him of what I and many others feel. That he is truly a Knight.

    I know this is getting long, but bear with me. One or two more stories to tell about the SCA. Twice within the last month, I have been complimented by people in a way that touched me deeply. I was riding in a car with one of our "newer" fighters (he's been fighting a year and a half, and he is far from a newbie). I stated that I would like to get Knighted someday. He said that he would like to see that. I asked him what it mattered to him. He told me that I was one of the people who represented the best of what a knight should be (I'm paraphrasing here). He said that I was kind with everyone. I was generous with everyone. I was willing to train anyone. He said that I had impressed him and that he was trying to follow MY example in how to deal with people. He has no idea how much that touched me.

    The other story, was completely unexpected. On that same online forum I mentioned earlier, a user from my Kingdom posted a question. He asked, in each Kingdom, who are the unbelts that best represent Knighthood without considering prowess. In other words, who are the guys that if they could get their prowess up to snuff would make excellent Knights. Several names, from several kingdoms were mentioned and discussed, and no one from our Kingdom responded. The gentleman who originally posted the question responded that he was surprised that no one from our Kingdom responded, so he would let people know who he was thinking of when he asked the question. He said that the one person that came to mind for him from our Kingdom was me. I was literally blown away! This is a guy who I spent some time with maybe 15 years ago back when I was in college. He lives on the other side of the Kingdom from me, and I haven't had any consistent contact with him since then. I've seen him on the side of the field, and I've always been pleasant, but I can't for the life of me figure out how I made that impression on him. I stood a little taller that day. I owe him some scotch (I'm pretty sure he prefers it to beer). I guess, in one way or another, I've made an impression on some people. The lesson to take away from all of this is that maybe, just maybe I could be a Knight if I could get my stuff together on the fighting side of life.

    Finally, the home life. My daughter is now almost 17 months old (I can't believe that). She's walking all over the place. She's climbing stairs. She's exerting her independence (this is a bad sign, she's not even two and she wants to be independent). She's also clumsy as all get out. She is constantly falling face first into our hardwood floors. She currently has two nice sized goose eggs on her forehead. I know this is a phase, and that she will eventually find her balance and grace (God I hope so, I really don't want people thinking I beat my kid!). I've said it before, and I'll say it again. I don't care how bad a day I've had, if I come home and see her smile, and hear her giggle, the rest of the world just doesn't matter. Two weekends ago, we were at a baby shower for some friends of ours who are adopting a child. It was more of a big party than a baby shower. They held it in a fire hall. Anyway, at one point, my daughter was overly tired. I sat Indian style on the floor, wrapped her in a blanket, put her in my lap and gently rocked her. It was a cold concrete floor, and my legs soon fell asleep. The pins and needles were killing me, but if my daughter was going to let me hold her there, I wasn't moving for anything. I wouldn't give that moment up. Those moments make me smile and warm my heart.

    Sorry it's such a long post. I should really write more often (and more focused). If you've read this far, then you're crazy. I've come to realize that this blog is as much for me as it is for anyone else. It puts some sign posts in my mind that can direct me back to fond memories when I go back and read. Thanks for taking an interest in my memories.

              There aren't enough hours in the day...   
    It's been too long.

    A lot has changed.

    I'm not sure I know where to begin.

    So, I've gotten a new job at a social services organization that tends to MH/MR children (and adults, but they focus on the kids). I'm a DBA working on their SQL Server databases. It is worlds different from what I was doing as a network admin / troubleshooter at the old place. There's a lot more expected of me, and I'm not entirely sure I have all the skills they are looking for, but I'm trying and it's been good to get back to thinking in SQL. I've actually done several things so far that have been really interesting (to me anyway) and there's a whole lot more to learn. It's definitely better for my résumé, but it doesn't seem as friendly a place as where I was working before. That could be because I've been there less than a month, but I just don't feel that I've "connected" with anyone. There are 3 onsite eateries for the staff to choose from, and every day at lunch, I find myself eating alone. It's very strange.

    In addition to changing jobs, schooling has gotten "interesting" as well. It seems that due to lack of enrollment, the college is cancelling my major. They are committed to letting existing students finish, but that means that I have to try and take all of my remaining classes NOW! This semester, I have three classes. The workload has been very daunting. I'm 4 weeks into the semester and I haven't died yet, but I'm really starting to burn out. It is getting harder and harder to motivate myself to do the homework. I keep telling myself that I'm a quarter of the way through the semester, and it's only 11 weeks left. I can do anything for 11 weeks, right? I'm afraid that the 4.0 that I've been carrying until this point may be in jeopardy. Wish me luck.

    Two weeks ago, there was a fighting event that is one of my standard events for the year. Before the tournament started, there was nobody in the lists (the fighting ring). So I armoured up and went out and stood the list basically challenging anyone there if they wanted to come and fight me before the tournament started. The Knights of our Kingdom have been complaining that the unbelts just don't show a fire to fight anymore. No one seems to be fighting pickup bouts outside of the posted tournaments, and they were really lamenting the fact that it seemed nobody cared. Well, I take this as a personal challenge. At every event I go to now, I am going to armour up early and take the field whether there's a tournament or not.

    Well, I stepped out into the list to see who would come out and fight me. I looked around and three of the Knights were standing and talking on the sidelines. When one of them finally saw me standing in the list, he slapped the other two on the shoulder, pointed at me and all three of them started racing to see who could get into armour to beat me the fastest. It was a great feeling that I had motivated some Knights into action. I had three warm up bouts in all. In two of the fights I thought I was doing fine. In the third, well, not so much. The Knight I was facing was a younger Knight. He's also training for Crown Tourney in two weeks. I couldn't hit him to save my life. Worse yet, I couldn't even make him break a sweat. He could beat me at will. Effortlessly! He seemed bored fighting me. This pretty much set the tone for me for the rest of the day. Suffice to say, the day didn't get any better. I made my point by stepping into the list before anyone else, but my fighting was uninspired and I did not show well (or at least I didn't think so). To top it off, my elbows have hurt ever since that day. I'm pretty sure that it's just a case of "Tennis Elbow", but it means I haven't fought since that day. It hurts to even pick up my sword. This is bad as Crown Tourney is in two weeks, and if I can't get in the practice time, I run the risk of embarrassing myself and more importantly embarrassing my lady wife and my child. I can't stand for that. I might have to think about withdrawing from Crown, but I'm not sure I really like that idea either. I guess we'll have to see.

    My wife surprised me today and painted my fighting shield with my heraldry. It's basically a graphical device that represents who you are in the Society. I've been fighting with just a basic white shield for the past year and a half. I didn't know how much I had missed having my colors displayed on my shield until I saw my shield painted again this evening. She truly does support me in all that I do (fighting, school, etc.). I'm not sure what I would do without her. We're still on completely opposite schedules and we almost never see each other. She's still working third shift so she can watch the baby during the day and we don't have to pay daycare fees. If we are lucky, we see each other for maybe an hour a day during the week. That will get better when school is done. That's another reason why this semester is killing me, and I can't wait for it to be over.

    My daughter just turned one year old. She's crawling all over the place. She's feeding herself now, and she's standing on her own. I can't believe it's been a year. She has moved from being a helpless little baby, into being a little person. She amazes me every day. Somehow or another, I have to figure out how to give her a good life. She deserves to be happy, and I need to find a way to give that to her. I guess only time will tell on that one.

    Well, I think I've rambled on for long enough at this point. There's a lot going on right now. New job, more classes, problems with fighting. All in all it's a good life, there's just ways that it could be better. There aren't enough hours in the day....

              People's Compassion is Boundless.   
    So it's been a long morning.

    It started much like any other. I had to install a new PC under someone's desk. In the process of trying to get the machine moved into place and plugged in, I managed to bounce my head off of one of the cubicle desk supports and into the edge of the computer I was trying to install. At this point, I'm pretty sure that I've given myself a concussion. As I'm rolling on the floor swearing and cursing in pain, there are 4 guys sitting at their desks not 3 feet away, and not one of them asks what's wrong or how I'm doing. I put my hand to my forehead and feel the familiar warm liquid that tells me I've split myself open. I stand up, walk across the hall and inform HR that I have injured myself (proudly wearing my crimson mask) before I go to the restroom to get a wet paper towel and assess the damage. Turns out I've opened up a 3 to 4 inch cut on my forehead. It's not a gash, just a cut, but it is bleeding profusely. I apply pressure with the wet paper towel and go back to HR's office to now give her the full details. After a brief discussion and an accident report, I go back to trying to plug in a PC while holding a wet paper towel to my head (mind you the 4 guys still haven't even asked if I'm okay). As I finish setting up the PC, my cell rings.

    "Help Desk", I say

    "Can you reset my email session, I'm locked out"

    "Sure," I respond "but it may be a few minutes as I'm bleeding from my forehead."

    "Well, how long will it be? It's kind of an emergency for me to get into my email."

    "I'm not kidding, I'm bleeding from my forehead. I'll get to it as soon as the bleeding stops"

    "How long do you think that will take? I really need to get into my email."

    At no point did the person on the other end of the line ask what happened or how I was doing. They simply wanted their email account reset and didn't really care what was going on in my life. I hung up and went back to my desk to reset the aforementioned email. As I was resetting the email, someone comes into my cube. At this point I should mention that the bleeding has still not stopped and I'm still sitting with my hand holding a blood red piece of paper towel to my head to apply pressure. Without commenting on the blood, this guy starts telling me a horrible story about how his PC is locking up and he can't log into our database. He doesn't ask how I'm doing or what happened, he simply wants his PC fixed and fixed NOW!

    So, I head over to his desk (still holding the bloody paper towel to my head) and start looking at his PC. The headache is killing me and I can't say my vision is particularly clear, but I'm doing the best I can. As I'm looking at the PC, my cell rings yet again.

    "Hi, remember that problem we were having yesterday?"

    "Yes", I respond trying to use short words and keep my cool.

    "Well, we're still having it"

    "Okay, well right now I'M BLEEDING FROM MY HEAD!!!!! I will try to look at your problem once the bleeding stops."

    "It's just that I'm going away on vacation for two weeks starting tomorrow and this needs to be resolved before I leave."

    "I understand that, and I will be happy to look into your problem ONCE THE BLEEDING STOPS!"

    "Well, it's just really important that I get this resolved"

    "I know. I'm sure I'll be able to look at it later this morning, or perhaps early this afternoon."

    At no point did this caller even miss a beat when I told him I was bleeding. He didn't ask if I was okay, he didn't ask what had happened. He simply kept telling me how important his problem was. The compassion of people amazes me.

    P.S. - The bleeding has now stopped, but the cut is still oozing a little. I have a major headache and don't really feel like dealing with other peoples' problems right now. Thank you for asking!
              DBA SQL server 2016 - CGI - Lac-Saint-Jean, QC   
    Be able to build and manage an optimal database schema; Within the digital transformation team, the SQL 2016 DBA will structure the MS SQL 2016 databases....
    From CGI - Mon, 05 Jun 2017 18:35:31 GMT - View all Lac-Saint-Jean, QC jobs
              Database & Data Management Account Executive - SAP - Palo Alto, CA   
    Possess hands-on knowledge of SAP, Sybase, Oracle, IBM, Microsoft, Teradata, Informatica, MapR, Cloudera, Hortonworks or other associated database and data...
    From SAP - Tue, 13 Jun 2017 02:43:05 GMT - View all Palo Alto, CA jobs
              Membership Database Developer - First Baptist Dallas Church - Dallas, TX   
    Perform all other duties and responsibilities as assigned by the Controller. Candidates MUST have a growing relationship with Jesus Christ and will be asked...
    From Indeed - Fri, 12 May 2017 15:57:30 GMT - View all Dallas, TX jobs
              Senior Platform Engineer - eTrigue - San Jose, CA   
    Extensive experience developing multi-threaded Java based services. Monitor evolving Java Framework, SQL and noSQL database trends and technologies for possible...
    From eTrigue - Sat, 06 May 2017 10:55:05 GMT - View all San Jose, CA jobs
              Comment on California’s Gun Control Efforts Suffer Two Legal Setbacks by DarkSoul Racing   
    You know what else the citizens of California voted on and PASSED: Prop 8, banning gay marriage, and the courts also found that to be a stupid law that failed constitutional muster, and it was subsequently repealed. These anti-gun laws are purely political currency, used by our state legislators only to push their agenda of simply getting re-elected. I genuinely believe that anyone who does a little research on the topic will realize that, in the simplest of terms, criminals do not follow laws. You can ban certain guns based on things as arbitrary as color (which they are), you can ban magazines based on the number of bullets they hold, or ban a certain class of rifle because it has aesthetic features that have no performance value but look "scary". None of these laws change anything; criminals will still do bad things. Violence against another (assault) is still illegal, murder is illegal, robbery is illegal; the mechanism of these acts is sort of irrelevant. In 2006, over 10,000 people died as a result of driving while impaired (drivers and victims). Where's the outrage? Where is the call for banning cars? And the comedy there is that driving is a privilege, not a constitutionally guaranteed right. The reality is that gun violence is at a historic all-time low nationally, and if you don't run in violent circles, i.e. you're not a gang banger, and if you are mentally stable and don't plan on committing suicide, your chances of being a victim of gun violence approach zero. And before you huff and puff because this hurts your feelings, I encourage you to do two things. First, do some factual research; the FBI database compiles all this information, and I think you will be surprised to find out that more people are killed each year by fists and hammers than by guns. Second, find someone local to take you to a shooting range, and experience shooting and the law-abiding culture of gun owners before you judge something you may know absolutely nothing about.
          Lenovo Mobile A7000 USB Driver/PC Suite Free Download For Windows   
    Hello, users. Today we are here again to share something with you: a fully supportive service to help all users solve their computer connectivity problems. Every day we share at least one USB driver or PC Suite for your computer's Windows operating system. Both the USB driver and the PC Suite are software that help us connect our mobile devices to the computer without trouble. In this post, we are sharing the Lenovo A7000 mobile USB driver and PC Suite software for all supported operating systems. The USB driver is the key piece of software that lets us connect the device to the computer via a compatible USB data cable. Just follow these few simple steps to connect your device to the computer:

    • Before doing anything, find the official USB driver or PC Suite for your computer's Windows OS.
    • Run the setup .exe file and install it on your Windows PC.
    • After installing it, attach your device to the computer with a compatible USB data cable.
    • The driver will help you connect your device without problems.
    • Check the data cable and socket before connecting your device.
    • That's it; you're done.
    Now, to download the USB driver and PC Suite for your Lenovo A7000, follow the link below and click to start your download immediately. It's free and available below.
    Description: Lenovo A7000 USB Driver/PC Suite Free Download
    Supporting OS: Windows 32-Bit/64-Bit
    Download (USB Driver)
    Download (PC Suite)

              Profiling in Your Pocket   

    Last month, the National Sheriffs’ Association unveiled the latest update to its BlackBox Digital Witness app, which allows users to report suspicious activity to law enforcement: an anti-terrorism feature. The sheriffs’ association developed the feature with the Department of Homeland Security and the National Fusion Center Association after the 2015 mass shooting in San Bernardino, California. According to the association’s executive director and CEO, Jonathan Thompson, the fight to stop “homegrown extremist[s] ... requires we work with our citizens and provide them with new tools to help in the fight against crime and to protect their families and schools.”

    The updated app sends another message as well: that sheriffs are authorizing profiling on an alarming scale.

    When BlackBox first launched in 2013, it served primarily as an emergency alert system. Users could record videos of crimes in real time and notify their personal contacts if they were in immediate danger. While neighborhood watch groups used the app to monitor suspicious activity and see where incidents were frequently reported, the program did not alert local police. The version of the app released last month, however, gives users a direct line to law enforcement, allowing them to send participating agencies recordings of possible threats. Police officials can also use the app’s GPS feature to locate suspects and respond to reports of suspicious activity.

    BlackBox Digital Witness is not yet a runaway hit. According to Edward Horcasitas, the app’s creator and a technical adviser to the sheriffs’ association, it has approximately 15,000 users who have recorded roughly 18,000 videos. While Horcasitas said users have reported a wide range of crimes, including domestic violence and aggravated robbery, the app’s online reviews mostly lay out hypothetical use cases. “Been watching a young man pick up and throw a beautiful Doberman puppy every day for the past few weeks. Now I can document his abuse and save this dog’s life,” wrote one user.

    At least thus far, law enforcement agencies have been more excited than the general public about the app's potential. After the latest version was released in May, nearly 70 agencies committed to using BlackBox Digital Witness. The sheriff of Florida's Orange County told the Orlando Sentinel that the app will allow users to "provide information to law enforcement so that we can mine the data and make a determination of whether or not it's useful." An officer using the tool in Muskogee, Oklahoma, told local reporters the app makes police officers safer because they can witness incidents as they happen. He also said it will make prosecution easier by providing video evidence to support a case. (Although he did not provide specifics, Horcasitas says he knows of several cases in California that have been prosecuted based on evidence provided via the app, including an animal abuse case.)

    While the app itself represents a potentially dangerous incursion on civil liberties, what’s more disturbing is what it reveals about the priorities of the National Sheriffs’ Association and the Department of Homeland Security. National Sheriffs’ Association President Greg Champagne said in February that the organization agrees with Donald Trump’s approach to enhancing national security, particularly when it comes to clamping down on immigration. “We have to give our president the benefit of the doubt,” he said, pointing out that Trump was the first in “quite a while” to invite the association’s leadership to the White House.

    The app’s new terrorism feature comes at a time of heightened scrutiny of Muslim Americans and immigrants, scrutiny that’s been sanctioned by President Trump, the Justice Department, and DHS. In that Orlando Sentinel story, the Orange County sheriff encouraged civilians to report on “individuals who indicate that they’ve been self radicalized by the way they communicate with others either through social media, emails or other communications [and] by their behaviors sometime in their neighborhoods [and] statements that they make.” This is a recipe for rampant profiling.

    BlackBox urges its users to surveil anyone they consider a “terrorist,” a dangerously vague concept. Colloquially, government officials and media outlets use the term to describe Muslim extremists as opposed to white supremacists or mass shooters. DHS also now considers some Black Lives Matter protesters terrorists. Even if local agencies choose not to follow up on a report, the data is stored forever and can potentially be used to conduct long-term surveillance on anyone who’s considered a threat, not just people who exhibit dangerous behavior. According to an investigation by the Intercept, people who are mislabeled as terrorists may be added to a no-fly list or end up in jail. Family members and acquaintances are sometimes identified as possible terrorists by proxy. Even with such high stakes, Horcasitas says the app includes no built-in features to handle false reporting or abuse.

    The BlackBox app is the latest iteration of the post-9/11 "See Something, Say Something" campaign, which started in New York City and has since been adopted by DHS and the Transportation Security Administration. Ten years after its adoption by New York's Metropolitan Transportation Authority, there was no indication that the campaign successfully derailed a terror threat. What we do know is that apps and websites for reporting crime and suspicious activity to law enforcement have consistently led to the profiling of marginalized communities.

    Locals have used the French Quarter Task Force app in New Orleans to report suspicious people to law enforcement, which has sometimes devolved into criminalizing homeless people and people of color. In Washington, D.C.’s affluent, predominantly white Georgetown neighborhood, a local shoplifting prevention app became a mechanism for monitoring black shoppers. According to the Washington Post, around 70 percent of people flagged through the app were black, even though black residents make up less than 4 percent of Georgetown’s population. Research by Rutgers University professor Jerome Williams indicates black people are no more likely than whites to shoplift but are more likely to be reported to the authorities. Earlier this year, Wired also reported on a web portal called the Thin Blue Line Project, which was supposed to use GPS to track Muslim “threats.” According to the Southern Poverty Law Center, it actually monitored harmless civilians, including mosque worshippers and Muslim student organizations.

    Although the French Quarter app is still operational, the Thin Blue Line Project and the Georgetown app are both now defunct. While there is very little evidence that crowdsourced crime-fighting apps are effective, many police departments remain optimistic about their power to prevent criminal acts. If the growing support for BlackBox in the law enforcement community is any indication, anti-terrorism apps won’t be going away. In an era of heightened racism and surveillance, these tools seem more likely to ruin lives than to save them.


              Oracle Database Administrator - Splice - Ontario   
    Are you looking to advance your consulting career by adding more project experience to your profile? Are you a passionate Database Administrator (DBA) who...
    From Splice - Mon, 19 Jun 2017 11:47:07 GMT - View all Ontario jobs
          What is cPanel Hosting? What Exactly is cPanel?   
    What is cPanel hosting? cPanel in WordPress? Has the question ever crossed your mind: what exactly is cPanel?

    Whenever we build a site, we host it on a hosting plan, and that's when we need cPanel. So what exactly is cPanel?

    What is cPanel?

    cPanel is an online, Linux-based web hosting control panel. It provides a graphical user interface and automation tools so that we can manage our site from our hosting account.

    Through cPanel, resellers, end-user website owners, and administrators manage their hosting through a web browser.
    In simple words, cPanel means Control Panel: the place from which we can manage our hosting.


    What can we do through cPanel?

    1. Manage email:
    From cPanel we can create new email accounts and manage them.

    2. Manage domains:
    Through cPanel we manage our domains and subdomains.

    3. Site data and file backups:
    We manage our site's backups from here, such as taking a backup of the site and restoring it, etc.

    4. WordPress installation:
    Most bloggers these days prefer WordPress, and WordPress is installed through cPanel.

    5. Database management:
    Our site's database is also managed from here.

    cPanel add-ons:

    cPanel also offers various add-ons, including different kinds of web applications such as WordPress, Joomla, and Drupal.

    How to open your site's cPanel:

    Simply open www.yoursiteurl.com/cpanel in your browser. Adding /cpanel after your site's name brings up your site's cPanel login page.
    After that, log in with your username and password.

    You can also go to cPanel's official site, www.cpanel.com, to see a demo of what cPanel looks like and which facilities it offers.

    Once there, click on the cPanel demo. You can also see the cPanel interface in the image below.

    [Image: the cPanel interface]
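
    Beyond the /cpanel path, hosts that run cPanel conventionally also serve the panel on ports 2082 (HTTP) and 2083 (HTTPS). A minimal reachability check using only Python's standard library; the hostname example.com is a placeholder, and the standard-port assumption may not hold on every host.

        import socket

        # Conventional cPanel ports; substitute your own host for example.com.
        HOST = "example.com"
        PORTS = {2082: "cPanel (HTTP)", 2083: "cPanel (HTTPS)"}

        for port, label in PORTS.items():
            try:
                # Attempt a plain TCP connection with a short timeout.
                with socket.create_connection((HOST, port), timeout=3):
                    print(f"{label} reachable on {HOST}:{port}")
            except OSError:
                print(f"{label} not reachable on {HOST}:{port}")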


          How to use the SQLite open method in Ruby ---------- def initialize(dbfile) ...   
    How do I use the SQLite open method in Ruby?
    ----------
    def initialize(dbfile)
      @dbfile = dbfile
    end

    def create(zipfile)
      return if File.exist?(@dbfile)
      SQLite3::Database.open(@dbfile) do |db|
        db.execute(<<-SQL)
    (rest omitted)
    ----------
    The book's explanation says that `return if File.exist?(@dbfile)` returns early if the file exists, and that if the file does not exist, the open method opens a new database file. But I don't quite understand how a file that doesn't exist can be "opened"...
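
    The behavior being asked about is common to SQLite bindings in general: "opening" a database file whose path does not exist yet creates an empty database at that path, so "open" really means "open or create". A minimal sketch of the same behavior using Python's built-in sqlite3 module; the file and table names are illustrative.

        import os
        import sqlite3

        DB = "test.db"  # illustrative filename
        if os.path.exists(DB):
            os.remove(DB)  # start clean so the effect is visible

        # sqlite3.connect(), like SQLite3::Database.open in Ruby, creates the
        # database file if it does not exist yet.
        conn = sqlite3.connect(DB)
        conn.execute("CREATE TABLE zipcodes (code TEXT, address TEXT)")
        conn.commit()
        conn.close()

        print(os.path.exists(DB))  # True: the file was created by the "open" call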
              Systems Analyst (Teradata Database Developer) - McCain Foods (Canada) - Florenceville, NB   
    Ability to assess new initiatives to determine work effort and estimate time-to-completion. The world’s largest manufacturer of frozen potato specialties,...
    From McCain - Tue, 04 Apr 2017 12:46:27 GMT - View all Florenceville, NB jobs
              Prevalence of and risk for gastrointestinal bleeding and peptic ulcerative disorders in a cohort of HIV patients from a U.S. healthcare claims database   
    - Source: journals.plos.org
              Database Tour Pro 8.2.4.33   
    Database Tour Pro 8.2.4.33 | 5.2 MB. Database Tour and Database Tour Pro are cross-database tools with a large set of DB tools and utilities ...
              Postgresql block internals   

    This blog post is the result of me looking into how Postgres works, specifically its database blocks. The inspiration and essence of this post come from two blog posts by Jeremiah Peschka: https://facility9.com/2011/03/postgresql-row-storage-fundamentals/ and https://facility9.com/2011/04/postgresql-update-internals/

    Database: 
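
    For anyone who wants to poke at blocks themselves, PostgreSQL ships a contrib extension, pageinspect, that exposes raw page contents from SQL. A minimal sketch driving it from Python with psycopg2; the connection string and the table name t are placeholders, and installing the extension usually requires superuser rights.

        import psycopg2

        # Placeholder connection string and table name, for illustration only.
        conn = psycopg2.connect("dbname=test user=postgres")
        cur = conn.cursor()

        # pageinspect is a standard contrib extension shipped with PostgreSQL.
        cur.execute("CREATE EXTENSION IF NOT EXISTS pageinspect")

        # Dump the line pointers and tuple headers of block 0 of table t.
        cur.execute("""
            SELECT lp, lp_off, lp_len, t_xmin, t_xmax, t_ctid
            FROM heap_page_items(get_raw_page('t', 0))
        """)
        for row in cur.fetchall():
            print(row)

        conn.close()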

              Zebra puzzle as a SAT problem   

    Zebra puzzle as a SAT problem

    Database: 
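
    The heart of any SAT encoding of the Zebra puzzle is mapping "attribute A is in house H" to a Boolean variable and adding exactly-one constraints in CNF. A minimal sketch of that encoding step in Python, using plain DIMACS-style integer literals rather than any particular SAT library; the variable-numbering scheme is illustrative.

        from itertools import combinations

        def var(attr, house, n=5):
            # Map (attribute index, house index) to a positive DIMACS variable number.
            return attr * n + house + 1

        def exactly_one(literals):
            # CNF clauses forcing exactly one literal to be true: one
            # "at least one" clause plus pairwise "at most one" clauses.
            clauses = [list(literals)]
            clauses += [[-a, -b] for a, b in combinations(literals, 2)]
            return clauses

        # Example: attribute 0 (say, a nationality) occupies exactly one of 5 houses.
        cnf = exactly_one([var(0, h) for h in range(5)])
        for clause in cnf:
            print(clause)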

              Friday Philosophy – “Technical Debt” is a Poor Term. Try “Technical Burden”?   

    Recently my friend Sabine Heimsath asked a few of us native English speakers what the opposite of “technical debt” was. My immediate reaction was to say:

    I’d say (sarcastically) “proper development” or “decent designer” or even “what we did 25 bloody years ago when we were allowed to take pride in the software we created!”

    Database: 

              Path to Better Presentations   
    Database: 

              Episode 12 – CosmosDB – The New Challenger in the Cloud-based NoSQL Arena   

    Recently Microsoft announced Azure Cosmos DB, the successor to DocumentDB. One of our Datascape regulars, Warner Chaves, called me up to tell me how excited he was and how significant he thinks this development is for cloud-based databases.

    Database: 

          How to watch the Doctor Who season 10 finale in streaming on July 1: times and schedule   

    The Doctor Who season 10 finale has arrived and will air in a few hours on the British channel BBC One, but we in Italy will also have a chance to watch it through some streaming sites. The Doctor Who 10 finale is the episode of farewells: besides being the second-to-last with Peter Capaldi in the role of the Time Lord, it is the last with Michelle Gomez as Missy, and it is also the one that closes the era of Steven Moffat, the show's showrunner since 2010, who has been at the helm since the fifth season. The twelfth episode of the tenth season is titled "The Doctor Falls" and runs 90 minutes in total instead of the canonical 45; here is the synopsis: The Doctor comes face to face with an army of Cybermen in an attempt to protect the human race from an unthinkable fate. When does the Doctor Who 10 finale start? In Great Britain, the episode will air at 6:30 pm local time on BBC One. In Italy it can be watched at 7:30 pm, while on BBC America it will air at 8:30 pm. Anyone who wants to try the live stream can simply go to the FilmOn site, which lets you watch British and Scottish channels for free from the comfort of your own couch. No registration is required. For those who can't tune in, the episode will already be available tomorrow morning on the official Doctor Who site, in the original language and without ads. Finally, those who prefer to wait for Italian subtitles and don't want to wait for the broadcast in our country can choose to watch Doctor Who on the web, searching for a good site that lets you watch the episode for free with built-in subtitles (there are many useful websites, such as SerieTv SubIta, always updated with the latest TV series added to their database). https://www.youtube.com/watch?v=fjHIwCIlRvw Doctor Who returns on December 25 with its special Christmas episode.
              Wildlife Biologist II – Baffin - GOVERNMENT OF NUNAVUT - Pond Inlet, NU   
    Applied knowledge of statistical procedures, applications, data tabulation, computer applications coupled with the ability to establish databases and geographic... $97,734 a year
    From Indeed - Fri, 17 Mar 2017 19:01:25 GMT - View all Pond Inlet, NU jobs
              Senior Technical Writer - Oracle - Canada   
    Review technical information prepared by other staff members for clarity and content. The Oracle Database User Assistance team creates the user assistance ...
    From Oracle - Thu, 22 Jun 2017 19:13:01 GMT - View all Canada jobs
              Patent Agent - ITIP, LLC - Greater Toronto Area, ON   
    Duties include, but are not limited to, filing Canadian patent applications, managing a large database and dockets, reviewing patent application documents, and... $120,000 a year
    From Indeed - Fri, 30 Jun 2017 17:03:28 GMT - View all Greater Toronto Area, ON jobs
              Exportizer Pro 6.1.2.28 Multilingual   
    Exportizer Pro 6.1.2.28 Multilingual | 3.4 MB. Exportizer Pro is a database export tool. It allows you to export data to a database, file, clipboard, or printer. Exportizer Pro works with databases via ADO, BDE, or InterBase/Firebird. It can open ODBC data sources; files of DB, DBF, MDB, ACCDB, XLS, XLSM, XLSB, GDB, IB, FDB, HTML, UDL, DBC, TXT, and CSV types; and databases specified by ADO connection strings.
              TweakBit Driver Updater 1.8.2.1 Multilingual Portable   
    TweakBit Driver Updater 1.8.2.1 Multilingual Portable | 11.8 Mb Driver Updater will scan your computer for outdated or missing drivers and provide you with an easy way to download and install the latest driver versions, which effectively resolves driver-related system errors and device malfunctions. With access to a comprehensive database of over 200,000 drivers, you can be sure you will always have the latest updates and enjoy uninterrupted device operation.
              Drupal Commerce: request - customizable user visible order numbers ...    

Drupal 8 Commerce 2.x needs a native, admin-UI-friendly way to create custom, user-visible order numbers, even if the underlying order number (order ID) remains a hidden, auto-incremented numeric database field.

This was a glaring deficiency in Drupal 7 Commerce: it used the SQL database's auto-incremented order ID sequence as the order number and did not natively provide a way to create custom, user-friendly order numbers. In response, various people jerry-rigged a variety of solutions via contrib modules and other workarounds that papered over a fundamental problem.

Drupal 8 Commerce 2.x needs to natively and organically solve the problem of letting customers and store owners see sensible order numbers. Thus far in my testing, Commerce 2.x still generates just a primitive sequence number, and there isn't any UI on the store object for configuring or customizing order number generation or manual incrementation.
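
In practice, a "customizable user-visible order number" is just a thin formatting layer over the stored integer ID. A minimal sketch of the idea in Python; the pattern syntax and function name are illustrative, not Commerce's actual API.

    from datetime import date

    def format_order_number(order_id, pattern="ORD-{year}-{seq:06d}"):
        # The auto-increment integer stays the primary key; only the
        # display string shown to customers changes.
        return pattern.format(year=date.today().year, seq=order_id)

    print(format_order_number(42))  # e.g. ORD-2017-000042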

    https://www.google.com/search?q=custom+order+%22drupal+commerce%22+site%...

    List of various Drupal 7 modules and other kludge solutions to the lack of order number customization:

    https://www.drupal.org/project/commerce_order_counter
    - not maintained since 2013
- - in Commerce 1.x the order number was officially called an "order ID"
- - the Commerce 2.x UI displays "order number" at the top of the order review page (an integer sequence on a new test site).

    https://www.drupal.org/project/custom_order_number

    (issues queue thread - 2011-2015)
    https://www.drupal.org/node/1257180
- issue-queue suggestions on how to hack around it with PHP code, without any decision from Commerce Guys to address the problem in the product

    ---------------------------

For Drupal 8 Commerce 2.x, it appears that someone has built an alpha contrib module to address this problem, but seriously, this needs to be in Commerce core. This is a basic and essential feature, not something "fancy" or peripheral.

https://www.drupal.org/project/commerce_order_number
- a Drupal 8 module again trying to kludge around this fundamental problem/omission in Drupal Commerce's design.


          Revit Architecture Certified Professional - Revit Architecture Basic + Advanced   
    This course provides a thorough and broad review of the functions in Revit Architecture. It starts with project setup through Revit's Project Browser, then covers the basic functions of a database-structured model. It goes on to cover how buildings are assembled using existing database-based objects (Families), and how you can easily edit and create your own objects from existing System Families. You will work on creating your own objects (Families).
              Log usage data in a different logging database by using Windows PowerShell   
    In this article we look at how to log usage data in a different logging database by using Windows PowerShell.
          Service applications that have their own database in SharePoint 2013   

              Re: Daniels   
    Olivia, I would like more information about your Perrin DANIELS line. I have several in my database and descend from at least one of them. However, your message here makes me question if I have mixed up some of the lines.

    Care to share? Or, have a GEDCOM you might be able to send? I am happy to share any info I have with you. However, I do not believe I have a Nancy Jane HENDERSON among my clan yet.

              Almost Famous Jeans Shorts   

    Restlessdress - YouTube
    Jeans from JcPenney (Almost Famous) Sweater and Jacket from Forever 21. Heels from Kensie Girl Awkward White Guys In Shorts. ... View Video

    Justina Gil - YouTube
    Almost Famous (1/9) Movie CLIP - Drugs & Promiscuous Sex (2000) HD This is the absolute easiest way to make distressed jean shorts/jeans, all you need is a pair of jeans and some scissorsI hope you f ... View Video


    Work of this artist accepted by the people of the time, what were this artist’s most famous When attending any opening reception for an art show, the artist is almost always in attendance. Dress appropriately – no jeans or shorts! You may wear khakis, pants, skirts, dresses, etc. ... Read More

    Images of Almost Famous Jeans Shorts

    Dear Sir Or Madam,
    Bermuda shorts. Among the men's trouser types which have been favorites with Her famous coat, with long thin sleeves, a pleat in back, almost always at the bottom of their list of preferences. ... Read Content

    AllRovi - Wikipedia, The Free Encyclopedia
    AllRovi is a commercial database launched by the Rovi Corporation in 2011. It combines information and reviews about music and movies from the former services AllMovie and AllMusic. ... Read Article

    Mountain Man Outpost
    The famous Wilderness Road serves as the main street of Newbern, which is across the New River Do not wear camouflage or jeans. Shorts are discouraged give you a test (don’t wait until free time is almost over or you won’t be able to complete ... Read Full Source

    Case Studies: Disruptive Student Behavior
    Slipped …. they haven’t just slipped below his shorts (if he had been wearing shorts), but his low-rider jeans have slipped so far that is (almost) open access. The student body is diverse in both age, race The Entitled (Students of the rich and famous) February 2012, ... Read More

    BRAZIL - ACCA | The Association Of Chartered Certified ...
    Grown by an average of almost 4% since Luiz Inácio Lula da Silva took in jeans or even shorts. Ask beforehand about the dress code. Shorts and flip-flops Most famous citizen: Ronaldo Luís Nazário ... Fetch Doc

    The Launch Pad
    Top or a bright colored, wild-patterned tank top almost anywhere that will fit almost any of the skinny jeans that rocked in the winter, who doesn't love jean shorts? But don't let the jeans take all the Upside: you just can't go wrong with the world famous Old Navy flip-flops!!! ... Content Retrieval

    Brema Backstage - YouTube
    It' almost impossible for a biker don't have a Brema jacket inside the wardrobe, they are famous for the perfect fit and the high quality fabrics. 2:50 Watch Later Error Diesel Jeans and Shorts by Brandsdistribution 686 views ... View Video

    Almost Famous Jeans Shorts Photos

    The 9th Annual Summer Music Festival At Walnut Hill
    We will attend concerts including one at the world famous Tanglewood Music Center, featuring Ladies: Dresses, dress pants (loosely fitted, no shorts, no jeans), skirts, blouses, and dress shoes (no Travelers' checks and credit cards are accepted almost everywhere. ... Document Viewer

    What's Your Non Scale Victory ( Brag Away ) - Page 44
    Where we go out in the backyard when it's covered in snow and start runningbarefoot in our shorts Haha I guess I'm kind of like that old Icelandic dude, Wim Hof(funny name XD) who's famous for doing my 'too tight' jeans down to my taller, slimmer, gorgeous sister, and getting her 'fat jeans', she ... Read Article

    Pictures of Almost Famous Jeans Shorts

    Virginija Rupainienė, Beata Baskakovienė, Sandra Shaw,
    B Two pairs of jeans, shorts and clean underwear She came from an obscure village in Eastern France and started being famous when she was little At the Plaza cinema there are almost twice as many seats as in the Celebrities. ...

    Inexpensive Must Have: Almost Famous Jean Shorts
    Access This Document

    The Best Mary Janes - Shoes For Girls - Kids' Fashion
    Girls can wear them with anything, dresses, skirts, jeans, slacks; almost any outfit looks complete with a Lelli Kelly shoes are famous for their feminine, frilly, floral patterns. They look fresh and modern, especially when paired with capris or short shorts. ... Read Article

    Photos of Almost Famous Jeans Shorts


    shorts, men’s sandals, jeans, or sneakers allowed. Board 6:30 pm, Cruise 7:30 to 10:30 pm HELICOPTER TOURS THE BIG APPLE - $181/ person See the United States’ most famous landmark, The Statue of Liberty, so close you can almost reach out ... Fetch Document

    Images of Almost Famous Jeans Shorts

    CC Reative Outure
    Its sales almost exclusively through the Web. “If the famous. It has hit mainstream America hard in the past few years with its in today’s jeans-and-shorts world and conveys ... Read Full Source


    We stock almost anything you could possibly need for that special Famous brand sports and casual wear, caps, clothing, shin pads, socks, sweats, shorts and tonnes extensive range of styles and sizes in both ladies and men’s jeans. ... Access Full Source

    Cinema.usc.edu
    Helped to make an almost-lost part of gay male now the middle-aged king of the except for a "Calvin Klein Jeans" on the front and "coming out" entry (" 'New Gay Shorts' . . . The playmates may titillate us with their smooth, famous photographer tells all "4) and yet re- ... Visit Document

    U.S. DEPARTMENT OF COMMERCE PATENT AND TRADEMARK OFFICE
    Jackets, jeans, jogging suits, jump suits, neck ties, shorts, boxer shorts, gym shorts, slacks, sport coats, sport prove that the DAVIDOFF mark is famous or is used in many ... Retrieve Doc

    Almost Famous Jeans Shorts Pictures

    Jongb11.weebly.com
    ... Retrieve Here

    Pictures of Almost Famous Jeans Shorts

    Atlanta’s Merchandise Marketplace
    6, 6 1/2 AND 7 FAMOUS BRANDS, EXCELLENT CONDI-TION, VARIOUS COLORS WALKER, ALMOST NEW, USED VERY LIT-TLE, $40 404-761-7607 HAPEVILLE, GA Dress pants, jeans, shorts, dress shirts, long sleeve tops, short sleeve tops, ... Access Doc

    Almost Famous Jeans Shorts Images

    North And South - Welcome To CrossFit: Forging Elite Fitness
    By almost all accounts, Bagent was a hit at the Northeast famous June gloom, as the teams hit the rowers and Ronnie Teasdale and the parade of jeans shorts—the ... Fetch Content

    Destination Information Guide Ethiopia - Luxury Tours And ...
    Sometimes to almost freezing at night. The eastern tops, jeans, khakis, and shorts are satisfactory. unlike those of famous explorers of an earlier period such ... Read Here

    2009 New York Life International Global Summit
    NO jeans or shorts NO jeans or shorts. Business: Suit, dress or pantsuit Sport jacket and slacks or suit; tie The coach then travels to Spier Wine Estate, one of the most famous wine estates in the Cape. Almost all hotels, shops and restaurants, and even national parks and game reserves accept credit ... Fetch Here


    Very famous feast Halloween is celebrated on 31st October. peak of Slovakia, also located in this national park, is Mt. Kriváň 2495 m. high. Almost two For sports we put on sports wear, such as jeans, shorts, T-shirts. ... Fetch Document

    Teen Fashion Sitemap - Page 4 2012-10-19 - Spiderbites Of ...
    Her jeans and button-down look totally fresh and modern, thanks to her funky choice of Erin Conroy of Famous Footwear gave me the scoop on the must-have versions of these The weather is finally starting to warm up, so it's almost time to break out the summer shorts! ... Read Article


              #10: Transact-SQL Cookbook: Help for Database Programmers   
    Transact-SQL Cookbook
    Transact-SQL Cookbook: Help for Database Programmers
    Ales Spetic , Jonathan Gennick
    (12)

    Buy new: CDN$ 17.59

    (Visit the Bestsellers in Web Development list for authoritative information on this product's current rank.)
              Payroll Administrator - H&H Enterprises - Las Vegas, NV   
    Ensure accuracy of payroll records by maintaining database with updates in status changes, tax withholdings, benefits deductions, time off accruals....
    From Indeed - Tue, 06 Jun 2017 16:32:44 GMT - View all Las Vegas, NV jobs
              Director, Grants - Liberty Science Center - Jersey City, NJ   
    Proficiency with donor database (Tessitura a plus), Microsoft Office Suite and internet applications. Liberty Science Center (LSC) is seeking a driven, results...
    From Liberty Science Center - Wed, 05 Apr 2017 20:11:15 GMT - View all Jersey City, NJ jobs
              Database Administrator - Fund Development - La Rabida Children's Hospital - Chicago, IL   
    Generate monthly and special reports and forecasts of donors and prospects according to source, purpose, gift level and solicitation activity using donor...
    From La Rabida Children's Hospital - Thu, 29 Jun 2017 06:17:42 GMT - View all Chicago, IL jobs
              Links and Resources for May is Mental Health Awareness Month    

    Come to the library this month and check out our display on "May is Mental Health Awareness Month", or explore these links:

    United Way of Connecticut - Mental Health Care Links

    National Institute of Mental Health

    You can access our database, the HEALTH AND WELLNESS RESOURCE CENTER. From home, you need to enter your Hamden Library card number. This database offers full-text access to various reference resources, like "The Gale Encyclopedia of Medicine" and "The Gale Encyclopedia of Mental Health" with online updates from 2012. You can also access full-text articles from medical journals, pamphlets, and newspapers, and find information on various diseases and conditions.

     Explore materials dealing with mental health in our online catalog.

    The Harriet Beecher Stowe Center in Hartford, CT will offer a discussion on "Mental Health: Stigmas, Stereotypes and Solutions", on Thursday, May 16, 2013 from 5 - 7 pm.



              (0-2 yrs) X Byte Technolabs Hiring For .NET Developers @ Ahmedabad   
    X Byte Technolabs [www.xbyte-technolabs.com] Openings For .NET Developer with (Web Scraping/crawling) in IT Company @ Ahmedabad Job Description : *Design, build & improve our distributed system of web crawlers *Integrate the data crawled and scraped into our databases *Experience with web crawling/ Web Scraping project *knowledge of web technologies (HTML, CSS, Javascript, ASP.Net MVC 4) ...
              (0-2 Yrs) WireFuture Hiring For Asp.Net MVC & Asp.Net Webform C# Developer @ Ahmedabad   
    WireFuture [www.wirefuture.com] Looking for 1 Asp.net MVC & 1 Asp.Net Webform C# Developer (0-2 Yrs) @ Ahmedabad Job Description : Skills We are looking for : 1) Design and development of web application using .NET technology (C#, ASP.NET MVC / Asp.Net webforms, Web services, HTML, JavaScript) 2) Should be strong in Database using MSSQL server ...
              Hi-Tech Outsourcing Services Hiring For Freshers : Technical Research Executive – Jr. Programmer @ Ahmedabad   
    Hi-Tech Outsourcing Services [www.hitechos.com] Openings For Technical Research Executive – Jr. Programmer ( Fresher) @ Ahmedabad Job Description: Primary skill sets: 1. Programming languages: Java, PHP, VB.net, ASP.net 2. Database: SQL and MS Access 3. Knowledge of Internet and offline macros Secondary skill sets: 1. Should have knowledge of Regex and Visual Basic. 2. Should ...
              Walk-In @ Gateway Technolabs : Android Developers : Ahmedabad, Gandhinagar : On 9 January 2016   
    Gateway Technolabs (Gateway Group of Companies) [www.gatewaytechnolabs.com] Walk In Interviews For Android Developers On 9th January 2016 at Ahmedabad, Gandhinagar Job Description : Should be good at making the architecture for Android Applications Good Knowledge of Android SDK’s and NDK Push Notifications Map Integration Social Media Integration Core Data/SQLite Database In-App Purchase Good Communication Skills Job ...
              SPRAT Hiring For Freshers : For .Net, C# Developers at Ahmedabad   
    SPRAT – Society for Rational Thinking [www.sprat.in] Openings For .Net, C# Developers (Freshers) at Ahmedabad Job Description : To design and build applications using n-tier architectures (preferably in MVC using C#); to design and build optimal SQL Server databases; to use JQuery or similar UI technologies, and deliver sustainable and extensible web applications. Salary: INR ...
               Organic Farm & Garden Manager- College of Marin- Novato, CA   

    Job description
    Manager, Organic Farm & Garden

    Salary: $6,565.08 - $8,324.17 Monthly

    Job Type: Full-time

    Location: Indian Valley Campus, Novato, California

    Description

    OPEN UNTIL FILLED
    Priority Screening Date: May 4, 2017, 12:00 pm Pacific Time

    All application materials must be received by the Priority Screening Date in order to be considered during the initial screening. Applications received after this date may be considered thereafter at the discretion of the college until the position is filled.

    Under the direction of the assigned Dean/Director, manages and directs the planning, agricultural production, marketing, partnerships, and outreach for the Indian Valley Organic Farm & Garden ('the Farm'). Manages the budgeting, marketing, management, and record keeping of the Farm, including maintenance of organic certification. Plans, supervises, and manages the farm/agricultural activities in support of the instructional and community partner programs. Additionally, oversees the maintenance and repair of a full line of equipment and farming infrastructure.

    DIVERSITY STATEMENT
    College of Marin strives to embrace diversity in all forms: it strives to be an inclusive community that fosters an open, enlightened and productive environment and demonstrates sensitivity to and respect for a diverse population.

    To Apply: http://jobs.marin.edu

    Essential Functions

    Responsibilities will include, but are not limited to, the following:

    Determines agriculture activities to be performed in support of the College mission. Performs crop management actions in a timely manner, practicing sound organic crop practices with the goal to minimize environmental impact and maintain organic certification.

    Directs and is responsible for marketing the farm's activities and produce; manages the customer database and mailing listserv; organizes sales to employees and the community.

    Develops annual and longer-range business plans and crop rotation for the farm/agricultural operations and recommends changes and improvements in farm/agricultural operations, staffing, facilities, and equipment.

    Coordinates with the Dean of Career & Technical Education, Director of Community Education, faculty and instructional staff to support the needs of the instructional program.

    Coordinates with faculty and staff on student educational programs.

    Coordinates farm/agricultural operations on- and off-site, including participation in educational activities, farmer's markets, use of farm/agricultural facilities for special events, tours, and other community interests and opportunities.

    Cultivates the College's ties with alumni, the surrounding community, and its current and potential stakeholders; maintains strong community relations with the public.

    Investigates and experiments with a range of agricultural methods to find practices that are environmentally and economically sustainable.

    Supervises and participates in daily farm/agricultural operations which include: soil preparation, planting, irrigation, cultivating, harvesting, and storing of crops and produce.

    Oversees the daily operation and maintenance of farm/agricultural machinery, equipment, and facilities.

    Organizes, schedules, assigns, and reviews the work of assigned farm/agricultural employees.

    Conducts orientation and training sessions for farm/agricultural workers, students, and volunteers assigned to the farm.

    Manages the farm as a model of sustainable, organically certified practices that other farms in the surrounding region can use to support their own operations; shares knowledge gained with others in the agricultural community through membership in area farming organizations, occasional talks and presentations, etc.

    Periodically inspects farm/agricultural buildings, grounds, and equipment to identify safety and sanitary hazards and initiates immediate or preventative maintenance measures; implements security measures to prevent theft and vandalism.

    Assures compliance of farm/agricultural operations to Federal, State, and local laws and regulations concerning health, safety, and sanitation.

    Develops an annual budget and makes requisitions for supplies and equipment as needed.

    Maintains records and production reports on farm/agricultural activities, particularly in support of maintenance of organic certification, and issues fiscal reports on farm/agricultural activities and budget expenditures on a regular basis.

    Performs other duties as assigned.

    Requirements

    Requirements & Desirables:
    A Bachelor's degree from an accredited institution in Agricultural Science, Agribusiness Management, Agricultural Economics, Crop Science, Agronomy, or a reasonably related field; and
    Two years of full-time, paid farming experience which included management/ownership of agricultural operations, small business operations, or equivalent; and
    A valid California Class "C" driver's license (travel to locations throughout the District may be required); and
    A Pest Control Certificate is required within 6 months of employment; and
    Sensitivity to and understanding of the diverse academic, socioeconomic, cultural, disability, gender identity, sexual orientation, and ethnic backgrounds of community college students.
    Desirable Qualifications
    A Master's degree from an accredited institution in Agricultural Science, Agribusiness Management, Agricultural Economics, Crop Science, Agronomy, or a reasonably related field;
    Experience with organic certification.
    KNOWLEDGE & ABILITIES

    Knowledge Of
    Farming/agricultural principles and methods and their application to diversified farm/agricultural activities
    Organic certification requirements and procedures
    Agronomy
    Small business management including experience in business planning, marketing of agricultural products, principles of public relations, maintenance and operation of farm/agricultural equipment and machinery
    Arboriculture and wildlife management
    State and local laws and regulations related to crop production, organic certification, small business management and farm/agricultural operations
    Health and safety laws and regulations applicable to farm/agricultural operations and product sales and including food safety
    California Food & Agriculture code, and water regulations
    Safety requirements for handling hazardous or toxic materials
    Agricultural waste disposal and sustainable agricultural practices
    Principles of supervision and training
    Basic principles of budget preparation and accounting
    Capabilities of computer systems, software, and hardware common to farm/agricultural operations and small business operations
    Ability To
    Manage day-to-day farm/agricultural operations in conjunction with an instructional program
    Develop and implement annual business plans
    Supervise, coordinate, and schedule the work of assigned staff
    Train others in farm/agricultural operations related activities
    Effectively communicate orally and in writing
    Give clear and concise instructions
    Interpret and apply rules and regulations related to farm/agricultural operations
    Establish and maintain effective and cooperative relationships with administrators, faculty, staff, students, vendors, and the community
    Analyze situations correctly and take effective action
    Evaluate work methods and performance
    Prepare and maintain accurate reports, keep accurate records and meet schedules and time lines
    Use Microsoft Word, Excel and e-commerce software
    Learn specialized computer applications
    CONDITIONS OF EMPLOYMENT
    Prior to employment, the selected candidate will be required to complete the following:
    In accordance with Federal Law all employees must provide proof of eligibility to work in the United States.
    Criminal Justice/Fingerprint Clearance.
    California Education Code, Section 87408.6 requires persons employed by a community college in an academic or classified position to submit to a TB risk assessment developed by CDPH and CTCA and, if risk factors are present, an examination to determine that he or she is free of infectious TB; initially upon hire and every four years thereafter.
    DISASTER SERVICE WORKERS: All Marin Community College District (MCCD) employees are designated Disaster Service Workers through state and local law (http://www.leginfo.ca.gov/cgi-bin/displaycode?section=gov&group=03001-04...). Employment with the MCCD requires the affirmation of a loyalty oath to this effect. Employees are required to complete all Disaster Service Worker-related training as assigned, and to return to work as ordered in the event of an emergency. For more information, please see http://www.marin.edu/police/EOP.html.
    Candidates applying for positions with the Marin Community College District may be disqualified from consideration should their conviction history not meet the standards established under the California Education Code.
    Supplemental & Salary Information

    WORKING ENVIRONMENT
    While performing the duties of this job, the employee has ongoing physical strain and/or muscular exertion; uses hands to finger, handle and feel computers and standard business equipment; and reaches with hands and arms. The employee may operate vehicles and heavy machinery in which manipulative skills and hand/eye coordination are important ingredients of safe and/or productive operations.

    The usual and customary methods of performing the job's functions require the following physical demands: carrying, pushing and/or pulling; significant climbing and balancing; significant stooping, kneeling, crouching and/or crawling; significant reaching, handling, and manual dexterity. Generally, the job requires 25% sitting, 60% walking and 15% standing. The job is performed under some temperature extremes, some hazardous conditions (e.g. mechanical, cuts, burns, infectious disease, high decibel noise, etc.) and in varying atmospheric conditions.

    CLASSIFICATION CATEGORY
    The Manager, Organic Farm and Garden is a classified administrative position, in compliance with all applicable sections of the California Education Code.

    SALARY INFORMATION
    FLSA Status: Exempt
    Salary Grade: MGMT 1
    Salary Range: $78,781.00 to $99,890.00 annually. Background and experience will determine placement.

    Please refer to the Management Salary Schedule on our Human Resources http://www.marin.edu/humanres/ for detailed information.

    SELECTION PROCESS
    Applications must include the documents listed in the Required Documents section to be rendered complete; incomplete applications will not be accepted. Screening will begin after the priority screening date. Applicants selected to interview will be contacted to schedule an interview appointment with the screening committee; however, applicants will be notified of their status, either way, following the screening. Regrettably, College of Marin is not able to offer reimbursement for travel to interviews at this time.

    To Apply: http://jobs.marin.edu

    To apply, visit: http://apptrkr.com/994556

              jClub Acquires Assets of Choxi to Create Exciting New Online Shopping Destination   
    ...the multi-million customer database of Choxi.com Inc., an online shopping platform that declared bankruptcy in December 2016. The acquisition allows jClub to expand their service offering and provide customers even better deals on consumer goods. "We're excited to ...

              Data Logger Market Size, Trends, Development, Key Manufacturers Analysis and Forecast Report 2017-2025   
    Data Logger Market Size, Trends, Development, Key Manufacturers Analysis and Forecast Report 2017-2025 The Insight Partners added “Data Loggers Market - Global Analysis to 2025” to its research database. The report is spread across 150 pages and supported by 8 company leaders. Data loggers are electronic gadgets that are designed to record data

              Senior Oracle Database Administrator - Wesco Aircraft - Austin, TX   
    7+ years' experience using UNIX and/or Linux Red Hat in an Oracle environment; 5+ years' experience in Oracle RAC, ASM, and Physical Standby/Active Data Guard...
    From Wesco Aircraft - Mon, 12 Jun 2017 20:10:00 GMT - View all Austin, TX jobs
              General Accountant - Atmosphere Real Estate - Dubai   
    Advanced experience in Accountant software, Database, MS Office. Atmosphere Real Estate is looking for General Accountant to be part of our Team!...
    From Indeed - Wed, 28 Jun 2017 12:07:31 GMT - View all Dubai jobs
              Get Ready for Winter Activities with a Combination of Stretching and Massage Therapy   

    Staying fit during the winter holiday season comes with a long list of unique challenges and obstacles, but staying injury free can take even more of an effort when you introduce new, winter-specific activities like skiing, skating and sledding into your fitness routine. Before the winter activity season is in full swing, take steps today to warm up and loosen your muscles so you can hit the slopes or the rink pain free, as well as minimize after-activity soreness and fatigue.

    Warm Up and Engage New Muscle Groups during the Pre-Season to Stay Injury Free

    Before carving fresh tracks down the slopes this winter or lacing up your skates to hit the rink with your kids, it is important to incorporate a combination of regular stretching and therapeutic massage sessions for an active and injury free winter season. The key to keeping active during the winter months and staying off your family’s injured list is to focus on body flexibility and lengthening your muscles in the pre-season. Many of the popular wintertime activities (skiing, snowboarding and skating) physically impact your lower body, thereby creating a need for you to focus your stretching and strength efforts on hip, hamstring and trunk/lower back flexibility.

    According to the National Academy of Sports Medicine, there are two common types of stretching – static stretching and dynamic stretching – that are good for promoting overall flexibility. Research indicates that holding a static stretch for 20-30 seconds allows your muscles time to relax and elongate, thereby increasing joint range of motion. Dynamic stretching on the other hand includes low intensity exercises that mimic sport specific movements. These types of stretches are good for warming up your body prior to a sports activity, as they help increase circulation, reduce muscle tightness and help your nervous system’s ability to contract muscles forcefully.

    To kick off your winter pre-season regimen, it’s a good idea to combine consistent stretching sessions with routinely scheduled monthly massages 8-12 weeks prior to the start of your favorite wintertime activity. Regular therapeutic massages prior to your desired activity allow your body to release the toxins found in tight muscles, while increasing overall flexibility and circulation. Additionally, your massage therapist can assess and monitor your body’s flexibility range, while suggesting specific stretches and other techniques that will focus on lengthening and strengthening your body’s problem areas.

    Remedy Your Winter Aches and Pains with Regular Massage Body Work

    As the snow begins to fall and the barometric pressure takes a dive south, your body faces some unique challenges, especially as you get older and recovery times for muscle injuries and overuse get longer. Even when you focus on preparing your body for winter wear and tear before the season starts, there still may be an unfortunate event where you will become injured or experience some sort of ache and pain associated with muscle overuse and fatigue.

    Lower back pain, in particular, is a common injury culprit in the winter as you can overdo it shoveling snow, incorrectly bend over to push your children's sleds or accidentally slip and fall on ice-covered sidewalks. In fact, research indicates that 70-85% of the population will experience low back pain at some point, and lower back pain is one of the most common and costly musculoskeletal problems in modern society. Luckily, research supports that massage therapy can minimize pain and disability, while increasing the speed of return to normal function. Massage specifically is beneficial for patients with subacute (lasting four to 12 weeks) and chronic (lasting longer than 12 weeks) non-specific low-back pain, especially when combined with exercises and education. (Furlan AD, Imamura M, Dryden T, Irvin E. Massage for low-back pain. Cochrane Database of Systematic Reviews 2008, Issue 4. Art. No.: CD001929. DOI: 10.1002/14651858.CD001929.pub2)

    Additional research from Group Health Research Institute, the University of Washington in Seattle, the Oregon Health and Science University in Portland, and the University of Vermont in Burlington revealed that massage therapy has helped reduce pain and improve function more rapidly than usual medical care in people with chronic low-back pain. Back pain is a health problem that affects millions of Americans and is the most common medical condition for which people use complementary and alternative medicine practices, such as massage therapy. (Cherkin DC, Sherman KJ, Kahn J, et al. Annals of Internal Medicine. 2011;155(1):1–9)

    Whether you are preparing yourself for family fun winter activities or recovering from a wintertime sports injury or accident, therapeutic massage sessions combined with a consistent stretching regimen should be your go-to strategy for minimizing aches and pains this winter season. Find your local Elements Therapeutic Massage location today to schedule your winter pre-season massage therapy sessions.


              Comment on Extended Discussion – Group 2 by Hannah Postel   
    As we've been reading the many articles about the environment and globalization, I've noticed how different the data can be! Some authors cite statistics that completely contradict those in other articles. I can of course understand having different views about the same data, e.g. interpreting the information in different ways. It is of course difficult to compile data about the number of starving people, the number of people under the poverty level, etc., but I think this makes analysis dangerous. In order to be able to suggest any useful steps for the future, we must have correct information about the present. It is important to be able to figure out what has gone well in the past and what has not. While of course looking for a solution (or a combination of multiple strategies) takes first priority, I think more progress could be made if we could build up a database of generally accepted, scientifically proven data. Just as in a scientific experiment, results cannot be accepted unless they are reproduced by multiple people multiple times. In order to be able to take further action, we should know where we stand now.
              FAQ   
    Akismet checks your comments and contact form submissions against our global database of spam to protect you and your site from malicious content.
              Web page search engine interface with SQLite/MySQL/ASP by dominionprosl   
    I have a SQLite database that I wish to be able to place on an IIS server and allow users to search the database. Although the database file is in SQLite format, it is possible to create a different DB file format if necessary... (Budget: $30 - $250 USD, Jobs: ASP, HTML, MySQL, SQL, SQLite)
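    For what it's worth, the heart of such a project is just a parameterized query over the SQLite file behind a small web endpoint. Below is a minimal sketch in Python rather than the ASP the poster asks for, so it only illustrates the query pattern; the site.db path and the pages(url, title, body) table are invented for illustration.

    import json
    import sqlite3
    from http.server import BaseHTTPRequestHandler, HTTPServer
    from urllib.parse import urlparse, parse_qs

    DB_PATH = "site.db"  # hypothetical database file

    class SearchHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # Read the search term from ?q=...
            term = parse_qs(urlparse(self.path).query).get("q", [""])[0]
            conn = sqlite3.connect(DB_PATH)
            try:
                # Parameterized LIKE query; never splice user input into SQL.
                rows = conn.execute(
                    "SELECT url, title FROM pages WHERE body LIKE ? LIMIT 50",
                    ("%" + term + "%",),
                ).fetchall()
            finally:
                conn.close()
            body = json.dumps([{"url": u, "title": t} for u, t in rows])
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body.encode("utf-8"))

    if __name__ == "__main__":
        HTTPServer(("", 8080), SearchHandler).serve_forever()

    Porting the same pattern to MySQL, as the posting allows, mostly means swapping the connection layer; the parameterized query stays the same.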
              Systems Analyst (Teradata Database Developer) - McCain Foods (Canada) - Florenceville, NB   
    Ability to assess new initiatives to determine work effort and estimate time-to-completion. The world’s largest manufacturer of frozen potato specialties,...
    From McCain - Tue, 04 Apr 2017 12:46:27 GMT - View all Florenceville, NB jobs
              PT Marketing & Sales/Lead Generation Assistant    
    PR/Marketing/Branding agency seeks a hardworking part-time intern (flat rate plus commission) to assist with online research, database building for mail merge (internally in MS Word, MS Access or MS Excel), and sales and lead generation for new business.

    Applicant should have a clear speaking voice as responsibilities will include contacting national theatres, cultural and civic organizations, colleges/universities and potential corporate sponsors. Intern will assist with media pitching and placement, commissioned sales lead generation and eventual artist/music talent booking. Intern may work virtually via laptop and Skype, but will be required to meet face to face weekly for approximately four hours. Intern will be in virtual communication with agency via email, online task management system, phone/text messages.

    Qualifications:
    - MS Office Suite proficient.
    - Touch-typing skills.
    - Dictation skills.
    - Internet research skills.
    - Congenial personality, communication, writing & organizational skills required.
    - Pro-active, detailed oriented & possess good follow-through while adhering to deadlines.
    - Proofreading & editing abilities, as well as problem-solving skills.
    - Able to multi-task & prioritize tasks sufficiently while working independently.
    - Strong interest in media, arts, entertainment & healthcare|wellness industries.
    - Reliable & have great time management skills.

    Please send a cover letter, resume & two writing samples: Part-time Assistant
              Re: Migration to the 3.2.2 bundle at Synology   

    by Pavel Posel.  

    Well, the disk station might not be the most powerful system, but it provides sufficient power for an installation that serves courses to dozens of students per year. A dedicated server would be superfluous until it has more clients. On the other hand, it is a great way to start with e-learning, and we use it as a supplement to lecture-based courses so that participants can return to the content. However, the installation has now disappeared, although we have all the data in the database and a backup of the installation files. But I have no idea what the installation pack set up elsewhere (apart from the cron job that I set up manually).

    Probably the installation bundle of the new version is broken. I don't know whether it is maintained by Synology or moodle.org - according to the portal, Moodle is a third-party application.



              MSSQL Database Administrator Job - HealthPartners - Richfield, MN   
    Strong knowledge of HealthPartners' operating systems (i.e. Mentors and educates HealthPartners technical staff, vendors and database query users in efficient...
    From HealthPartners - Fri, 02 Jun 2017 06:29:24 GMT - View all Richfield, MN jobs
              Oracle Database Administrator Job - HealthPartners - Richfield, MN   
    Strong knowledge of HealthPartners' operating systems (i.e. Mentors and educates HealthPartners technical staff, vendors and database query users in efficient...
    From HealthPartners - Mon, 01 May 2017 21:23:19 GMT - View all Richfield, MN jobs
              FAQ   
    Akismet checks your comments and contact form submissions against our global database of spam to protect you and your site from malicious content.
          Apply - QUMAK - Poland   
    Oracle Database 12c Administrator Certified Professional, Oracle Database 11g Administrator Certified Professional....
    From QUMAK - Wed, 08 Mar 2017 18:16:00 GMT - View all Poland jobs
              Data Entry - Sandhills Publishing - Lincoln, NE   
    Data Entry is responsible for entering information into our equipment databases. Data Entry is responsible for adhering to standards while reviewing this...
    From Sandhills Publishing - Tue, 20 Jun 2017 16:17:20 GMT - View all Lincoln, NE jobs
              Systems Analyst (Teradata Database Developer) - McCain Foods (Canada) - Florenceville, NB   
    Results-driven experience to collaborate with implementation and support teams to resolve complex data design issues and provide optimal solutions that meet...
    From McCain - Tue, 04 Apr 2017 12:46:27 GMT - View all Florenceville, NB jobs
              Driver Reviver 5.17.1.14 Full Serial Crack Download Free 2017   

    Driver Reviver checks, analyzes and, if necessary, downloads drivers from one of the world's largest driver databases, quickly analyzing your device drivers to identify, download and automatically install the latest versions. It ensures that your PC and its components are in optimal condition for the best possible performance. It eliminates the risk of downloading a faulty driver or even malware. It scans all hardware to identify the current driver installed on your system and determines whether it is the most up-to-date version. It can take many hours to locate each driver – Driver

    The post Driver Reviver 5.17.1.14 Full Serial Crack Download Free 2017 appeared first on Full Version Softwares Free.


          Driver Magician 4.9 Crack + License Key Download   

    Driver Magician 4.9 Crack + License Key Full Version Free Download. Driver Magician 4.9 Crack is a specialist program that hunts down the drivers on your PC and saves you time. Driver Magician can back up, remove, restore and update your drivers. The specialist program is clear and quite simple to use: all the device drivers on your system can be managed in a very simple manner. Furthermore, it has a built-in database of the latest drivers with the ability to go online. It saves a great deal of time

    The post Driver Magician 4.9 Crack + License Key Download appeared first on Full Version Softwares Free.


              TweakBit Driver Updater 1.7.2.4 Crack   

    TweakBit Driver Updater 1.7.2.4 Crack + Serial Key & License Key Download. It thoroughly checks and verifies your system to confirm you have the most suitable and latest driver versions installed. The program performs an in-depth scan of your system to find all of your old and missing drivers, then updates and installs the latest versions. It knows which drivers are outdated, missing or old. By checking your devices against its database of 26 million drivers for the most current versions, Driver Updater matches your hardware with the appropriate driver. Finally, Driver Updater installs the drivers and

    The post TweakBit Driver Updater 1.7.2.4 Crack appeared first on Full Version Softwares Free.


              Aeronautical Data Quality Engineer   
    MD-Lexington Park, Provide Subject Matter Expertise and Systems Analysis for quality of aeronautical databases supporting CNS/ATM RNP RNAV flight. Essential Job Functions: Provide system analysis and requirements analysis on data compliance with civil and military CNS/ATM aeronautical data quality standards and requirements. Provide systems analysis and requirements analysis on compliance with military and civil equ
              Postgresql block internals   
    This blogpost is the result of me looking into how postgres works, and specifically the database blocks. The inspiration and essence of this blogpost comes from two blogs from Jeremiah Peschka: https://facility9.com/2011/03/postgresql-row-storage-fundamentals/ and https://facility9.com/2011/04/postgresql-update-internals/ I am using Oracle Linux 7u3 and postgres 9.6 (current versions when this blogpost was written). Postgres is already installed, and […]
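    For readers who want to poke at the same block internals themselves, Postgres ships a contrib extension, pageinspect, that exposes raw page contents through SQL. Here is a minimal sketch using Python with psycopg2; the table name t and the connection settings are assumptions, and a superuser must first run CREATE EXTENSION pageinspect.

    # Minimal sketch: dump the line pointers and tuple headers of block 0
    # of a table, via the pageinspect contrib extension.
    # Assumptions: psycopg2 is installed, a table named "t" exists, and
    # "CREATE EXTENSION pageinspect;" has already been run by a superuser.
    import psycopg2

    conn = psycopg2.connect(dbname="postgres", user="postgres")
    cur = conn.cursor()

    # get_raw_page(relname, blkno) returns the 8 kB block as bytea;
    # heap_page_items() decodes the item pointers and tuple headers in it.
    cur.execute("""
        SELECT lp, lp_off, lp_len, t_xmin, t_xmax, t_ctid
        FROM heap_page_items(get_raw_page('t', 0))
    """)
    for row in cur.fetchall():
        print(row)

    cur.close()
    conn.close()

    Updating a row in t and re-running the query makes the MVCC behaviour from those posts visible: the old tuple gets its t_xmax set and a new version is appended, rather than the tuple being overwritten in place.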
              Sr SQL Developer - ASSURANT - Wayne, PA   
    This Sr. SQL Developer will be responsible for: · Designing ER diagrams and building database objects. · Developing and enhancing stored procedures and DB
    From Assurant - Thu, 15 Jun 2017 13:26:17 GMT - View all Wayne, PA jobs
              Kidney Care Advocate- Full Time - Export PA - Fresenius Medical Care - Export, PA   
    Responsible to ensure accurate and timely documentation of patient interactions and status, through maintenance of SAP database, and/or electronic medical...
    From Fresenius Medical Care - Sat, 24 Jun 2017 20:29:34 GMT - View all Export, PA jobs
              Senior Oracle Database Administrator - Wesco Aircraft - Austin, TX   
    Previous work experience with Oracle JDEdwards ERP system, SAP Oracle database using BRTOOLS, Oracle 11g on Solaris10 systems administration, Linux systems...
    From Wesco Aircraft - Mon, 12 Jun 2017 20:10:00 GMT - View all Austin, TX jobs
              Senior Developer - .Net SQL - EXL - Jersey City, NJ   
    Experience working with Reporting Tools / Business Intelligence Tools. Experience in designing and building database using MS SQL Server and/or Oracle....
    From EXL - Tue, 18 Apr 2017 00:09:56 GMT - View all Jersey City, NJ jobs
              Scientist - Business Process Modeling and Simulation - EXL - Jersey City, NJ   
    Scientist - Business Process Modeling and Simulation. Ability to work with databases such as SQL Server, Oracle....
    From EXL - Tue, 18 Apr 2017 00:09:55 GMT - View all Jersey City, NJ jobs
              Implementation Consultant - General Electric - United States   
    Working knowledge of SQL and/or Oracle databases. Knowledge of Centricity Business and ambulatory financial GECB....
    From GE Careers - Tue, 27 Jun 2017 10:27:42 GMT - View all United States jobs
              GeoDataSource World Cities Database (Gold Edition) July.2017   
    GeoDataSource World Cities Database with Latitude Longitude Information
              Application Data Engineering Lead - Tradeweb Markets LLC - Jersey City, NJ   
    Work closely with front and backend developers to define database interface, model the database and engineer solutions that are performant and maintainable....
    From Tradeweb - Tue, 14 Mar 2017 18:45:17 GMT - View all Jersey City, NJ jobs
              Sr. Database Administrator/Developer - International Software systems - Maryland City, MD   
    Develop data schemas that are performant and meet the requirements of the client*. Database Administrator/Developer shall have extensive experience in database...
    From Indeed - Thu, 29 Jun 2017 18:36:20 GMT - View all Maryland City, MD jobs
              Sorry About Payment Delay Letter   
    Dear Members,

    We are having a problem sending out the deposit confirmation. I discovered this weekend that the payments made were not showing up on our database. It should be fixed by Tuesday, and the confirming emails will go out. We are sorry about the delay.

    Management
              Part-Time Dispatcher/Customer Service Rep - Cash for Trash - Stittsville, ON   
    Proficient computer skills (email, google, google drive, word, excel online databases etc). Handling and balancing cash....
    From Indeed - Wed, 24 May 2017 17:57:41 GMT - View all Stittsville, ON jobs
              Dispatcher/Customer Service Rep - Cash for Trash - Stittsville, ON   
    Proficient computer skills (email, google, google drive, word, excel online databases etc). Handling and balancing cash....
    From Indeed - Wed, 24 May 2017 17:49:26 GMT - View all Stittsville, ON jobs
              Wave 2 Deluxes and More for Transformers: The Last Knight Toys on Amazon.com   
    Fellow Seibertronian ScottyP has been updating the sightings database, as he does, and came across some intriguing listings on Amazon.com for the deluxe class Premier Edition Transformers: The Last Knight wave, including Slug (in a new deco from Age of Extinction), but also Drift, Sqweeks, Steelbane and Slash (note: through a third party but seemingly reputable seller). Also listed, this time directly via Amazon and Amazon Prime eligible, is the Turbo Changer Knight Armor Megatron, ... View the full news story on Seibertron.com by clicking here.
              Transformers: The Last Knight General Retail Street Date is Today   
    For all the retailers that were able to stick to the street date embargo for Transformers: The Last Knight, today 24th April 2017 is the official start of movie toy season on plastic figurine hunting grounds! Below, you can find a round-up of the stores (online and brick-and-mortar) which have already shown us where the toys are available. As always, make sure you flag up toy in the (fully up to date) Sightings Database when you come across them in a store, it really helps other collectors ... View the full news story on Seibertron.com by clicking here.
              Comment on PHP/MySql AJAX Poll script with pie and bar graph by LlSastre   
    Where is the database structure?
              Nuclear history bibliography, 2014   
    It's time for the third-annual Nuclear History Bibliography wrap-up, that special feature of this blog where I spend a few hours searching academic databases for interesting keywords and then give you the results, with the aim of giving a rough guide to the state of the field as it is represented in print. The rules are the […]
              Sales Agronomist - Retail; IN/OH - Compass Minerals - Overland Park, KS   
    Solicits customer feedback to improve service; Responds to requests for service and assistance; Supports internal database development through CRM (Contact...
    From Compass Minerals - Tue, 18 Apr 2017 16:11:49 GMT - View all Overland Park, KS jobs
              Sales Agronomist - Retail; NE - Compass Minerals - Overland Park, KS   
    Solicits customer feedback to improve service; Responds to requests for service and assistance; Supports internal database development through CRM (Contact...
    From Compass Minerals - Tue, 18 Apr 2017 16:11:49 GMT - View all Overland Park, KS jobs
              Sales Agronomist - Retail: CA - Compass Minerals - Overland Park, KS   
    Solicits customer feedback to improve service; Responds to requests for service and assistance; Supports internal database development through CRM (Contact...
    From Compass Minerals - Tue, 14 Mar 2017 22:22:11 GMT - View all Overland Park, KS jobs
              Systems Analyst (Teradata Database Developer) - McCain Foods (Canada) - Florenceville, NB   
    Database development activities will include eliciting data requirements, source data analysis, design of ELT solutions, load and query performance tuning, data...
    From McCain - Tue, 04 Apr 2017 12:46:27 GMT - View all Florenceville, NB jobs
              Golf Equipment Sales FT- Edwin Watts Golf - Edwin Watts Golf | Worldwide Golf Enterprises - North Miami Beach, FL   
    Researches merchandise availability via computer database; The Sales Associate will maintain the "Sure No Problem" customer service philosophy by providing an...
    From Edwin Watts Golf - Sun, 23 Apr 2017 22:26:07 GMT - View all North Miami Beach, FL jobs
              Golf Equipment Sales PT - Edwin Watts Golf - Edwin Watts Golf | Worldwide Golf Enterprises - North Miami Beach, FL   
    Researches merchandise availability via computer database; The Sales Associate will maintain the "Sure No Problem" customer service philosophy by providing an...
    From Edwin Watts Golf - Sun, 23 Apr 2017 22:26:05 GMT - View all North Miami Beach, FL jobs
              Lab Technician - YG Dental - Denver, CO   
    Working knowledge of MS Office (especially Excel) and database systems. We are looking for a qualified Lab Technician to undertake a variety of laboratory...
    From YG Dental - Tue, 27 Jun 2017 12:42:10 GMT - View all Denver, CO jobs
              Lead Building Engineer - Shea Properties - Denver, CO   
    Ensure all building and common area service requests are documented in the proper database and completed in a timely manner. Monitor service requests and....
    From Shea Properties - Fri, 14 Apr 2017 20:00:49 GMT - View all Denver, CO jobs
              Data Entry Assistant - EUCI - Denver, CO   
    Through internet searches and other data entry efforts, the data entry assistant helps grow the database with prospective contact information.... $11 an hour
    From Indeed - Mon, 03 Apr 2017 14:46:22 GMT - View all Denver, CO jobs
              Data Entry Assistant - Academic Impressions - Denver, CO   
    You will perform internet searches and other data entry efforts to help grow the marketing database with prospective customer contact information which includes...
    From Academic Impressions - Tue, 14 Mar 2017 09:37:36 GMT - View all Denver, CO jobs
              Sales Lifecycle Associate - Brackish - Charleston, SC   
    Supporting the sales team in identifying new business opportunities and maintaining the customer database. Sales Lifecycle Associate....
    From Indeed - Wed, 07 Jun 2017 18:34:41 GMT - View all Charleston, SC jobs
              Purchasing & Estimating Agent - Crescent Homes SC LLC - Charleston, SC   
    Must be able to use word processing, e-mail, spreadsheets, database software, creation of reports and database maintenance....
    From Crescent Homes SC LLC - Sun, 07 May 2017 10:24:30 GMT - View all Charleston, SC jobs
              Computer Technician - Odyssey Logistics & Technology Corporate - Charleston, SC   
    Microsoft SQL Database admin a plus. We are currently searching for an eager Computer Technician for our International Forwarders Inc....
    From Odyssey Logistics & Technology Corporate - Fri, 05 May 2017 13:51:28 GMT - View all Charleston, SC jobs
              Accounting Assistant (Part-time) - Southern Current LLC - Charleston, SC   
    Accounts payable duties to include posting invoices, maintaining vendor database, ensuring proper authorization of invoices and processing check requests.... $12 - $15 an hour
    From Indeed - Fri, 17 Mar 2017 19:21:27 GMT - View all Charleston, SC jobs
              Marketing Automation and Lead Generation Manager - GlobalEnglish - Brisbane, CA   
    Working experience with Google Analytics, Salesforce, Pardot, and other Business Intelligence suites or other Marketing Database Management Systems....
    From GlobalEnglish - Fri, 30 Jun 2017 00:27:45 GMT - View all Brisbane, CA jobs
              Minimum Data Set Coordinator - Meadows - Chenoa, IL   
    Experience with database management. Minimum Data Set Coordinator (MDS) is a Registered Nurse that performs assessments on residents as mandated by the State of...
    From Meadows - Thu, 25 May 2017 22:17:33 GMT - View all Chenoa, IL jobs
              Analyst Business - Compass Minerals - Overland Park, KS   
    Financial and/or numerical database skills and proficiency preferred. Compass Minerals is a leading provider of essential minerals that solve nature's...
    From Compass Minerals - Thu, 11 May 2017 03:37:46 GMT - View all Overland Park, KS jobs
              Sales Agronomist - Retail; IN/OH - Compass Minerals - Overland Park, KS   
    Supports internal database development through CRM (Contact Relationship Management) software. Sales Agronomist - Retail;...
    From Compass Minerals - Tue, 18 Apr 2017 16:11:49 GMT - View all Overland Park, KS jobs
              Sales Agronomist - Retail; NE - Compass Minerals - Overland Park, KS   
    Supports internal database development through CRM (Contact Relationship Management) software. Sales Agronomist - Retail;...
    From Compass Minerals - Tue, 18 Apr 2017 16:11:49 GMT - View all Overland Park, KS jobs
              Sales Agronomist - Retail: CA - Compass Minerals - Overland Park, KS   
    Supports internal database development through CRM (Contact Relationship Management) software. Sales Agronomist - Retail:....
    From Compass Minerals - Tue, 14 Mar 2017 22:22:11 GMT - View all Overland Park, KS jobs
              Customer Service / Data Entry - B&J Groups, LLC - Morrisville, PA   
    Retrieve data from the database for customers as requested. Transfer data from paper formats into computer files or database systems using keyboards, data... $12 an hour
    From Indeed - Fri, 02 Jun 2017 13:46:28 GMT - View all Morrisville, PA jobs
              Immutability Changes Everything - Pat Helland, RICON2012   

    For a number of decades, I've been saying "Computing Is Like Hubble's Universe, Everything Is Getting Farther Away from Everything Else". It used to be that everything you cared about ran on a single database and the transaction system presented you the abstraction of a singularity; your transaction happened at a single point in space (the database) and a single point in time (it looked like it was before or after all other transactions).

    Now, we see a more complicated world. Across the Internet, we put up HTML documents or send SOAP calls and these are not in a transaction. Within a cluster, we typically write files in a file system and then read them later in a big map-reduce job that sucks up read-only files, crunches, and writes files as output. Even inside the emerging many-core systems, we see high-performance computation on shared memory but an increasing cost of using semaphores. Indeed, it is clear that "Shared Memory Works Great as Long as You Don't Actually SHARE Memory".

    There are emerging solutions which are based on immutable data. It seems we need to look back to our grandparents and how they managed distributed work in the days before telephones. We realize that "Accountants Don't Use Erasers" but rather accumulate immutable knowledge and then offer interpretations of their understanding based on the limited knowledge presented to them. This talk will explore a number of the ways in which our new distributed systems leverage write-once and read-many immutable data.
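    The accountant analogy translates directly into code: instead of updating a balance in place, you append immutable entries and derive any "current" value as an interpretation of the log. The following is a minimal sketch of that idea (the ledger and accounts are invented, not from the talk):

    # Minimal sketch of the "accountants don't use erasers" idea: an
    # append-only ledger whose entries are never mutated; the balance is
    # a fold over the immutable history.
    from dataclasses import dataclass
    from typing import List

    @dataclass(frozen=True)  # frozen=True makes each entry immutable
    class Entry:
        account: str
        amount: int  # cents; positive = credit, negative = debit

    ledger: List[Entry] = []

    def record(account: str, amount: int) -> None:
        # Corrections are new compensating entries, never in-place edits.
        ledger.append(Entry(account, amount))

    def balance(account: str) -> int:
        # An interpretation of accumulated knowledge, not stored state.
        return sum(e.amount for e in ledger if e.account == account)

    record("alice", 10_00)   # deposit
    record("alice", -3_50)   # purchase
    record("alice", -2_00)   # a mistaken charge...
    record("alice", 2_00)    # ...reversed by a compensating entry
    print(balance("alice"))  # 650

    Because every entry is write-once and read-many, the log can be replicated and cached freely, which is exactly the property the talk argues makes distributed sharing tractable.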

    Cast: Basho Technologies

    Tags: ricon2012, immutability, pat helland and RICON


              Hello again!   
    So, what does it take to wake me from my blogging slumber? A broken aircon unit making work "impossible" (according to Labour MPs, how would they know??) certainly helps.  A Royal baby? No.  A move to make porn harder to find? Not as such.  A clear and blatant threat to civil liberties and freedom?  Yes.  Oh yes... yes... yes David... that does it for me David... yessss...

    Yes, I'm talking about David Cameron's latest step away from anything remotely resembling liberal free-market small-state politics, the "default-on" policy.  You immediately know that this is nasty, because it has a snappy appealing title.  A sweet sugary unobjectionable layer of spin to cover the illiberal and unpalatable inside.

    Now, this is being presented as so, so reasonable because we all know porn is nasty, that children should be kept away from it, that it incites some men to carry out appalling crimes, and that anything that stops child porn being disseminated must be a good thing, right? Sorry, no.  Those are the reasons that I've heard being thrown around today in support of this move, and I'll start by squashing them.  But first, I'll just clarify a few points before anyone decides to take the wrong impression of me.

    Nothing that I say here is in any way, shape or form in defence of or condoning violent pornography, non-consensual sex or film/images of it, or child pornography.  

    I have a young daughter, and the thought of any of these makes me feel physically ill.  Quite frankly, anyone who tried that with anyone that I know would find themselves being taken for a ride, cuffed to a towrope behind one of my cars. Yes, I'd go to jail for that.  No, I wouldn't care.

    So, with that out of the way, let's turn to the justifications for this policy:

    Porn is nasty

    Yes, it is.  But it is legal (in many variants, at least).  There are many things that I consider to be nasty but which are legal. Should we make everyone who wants to do anything I don't approve of apply to me and opt-in before they are allowed to?

    More pertinently, if the bar is not set at what Parliament says is legal, where is it to be set - and who sets it?

    Children should be kept away from it

    Yes, they should.  In fact, that is the law as it stands.  18-rated material is not to be shown to minors.

    So this is not a change to the law, merely a change to the process.  Therefore, it is valid for us to look at the process and see whether this will help - which it won't.  What it will do is make parents feel that the State and the ISPs are doing their job for them, and that they can therefore opt out because the filter is "default on".  Unless the husband/boyfriend/older son has quietly defaulted off, of course.  And provided that the filter is perfect and catches every nasty thing without making any mistakes.

    And that brings me to one of the really serious problems - filters are just awful.  We fitted a filter to our kids' PCs, one of the leading ones in fact.  They both hated it.  It blocked a wide variety of perfectly acceptable websites - I recall the moment when we sent our daughter to a clothes website, to choose some holiday outfits.  It blocked that, because of the large amount of lingerie that could be viewed there.  We were forever approving exceptions, which is a hassle when the security is set at a level intended to defeat a determined 17-year-old lad.  Whenever that defeated us, we had to listen to the "I can't do my homework because of the filter" rants. How easy is it going to be to request an exception when David Cameron's appointee holds the codes, instead of the householder?

    And yes, sites that their schools sent them to in order to do their homework triggered the block. That is how sensitive porn filters are.

    That it incites some men to carry out appalling crimes

    Not proven.  Sociologists have tried to prove this on numerous occasions and failed.

    In any case, the men concerned will just be able to ring their ISP and ask for the filter to be taken off.  Then they will be able to look at disgusting pictures to their heart's content and - if there is a causal link flowing in that direction - they will be incited to go off and do horrific things.  Just like before.

    Personally, my view is that the arrow of causation is in the opposite direction, that men who are willing and able to carry out such crimes will (along with many other men) be attracted to porn. But that is just my view and I have no evidence to support it.

    That anything that stops child porn being disseminated must be a good thing

    I'd agree with this (wholeheartedly), but disagree that this policy will achieve it.

    First, as noted above, it is a filter that can be turned off at will.  So those that want to bypass it, can, err, turn it off.

    Second, I have (literally) no idea how these images (etc) are circulated.  But I very much doubt that it is via a searchable database on the open Web.  I suspect that other protocols are used, that the servers are locked down and only accessible to validated users as opposed to the Google spiders.  So even on a "default-on" broadband connection, I reckon the access will be no harder than before.

    If you disagree, think through this.  To filter a site out, the authorities will need to know where it is and what it contains.  Child porn (etc) is already very illegal.  So if they know enough to filter it out, they should be shutting it down instead.  Ergo, if we assume that the illegal sites are being shut down in line with the current law, the filter will only block legal sites and will leave the illegal sites untouched.

    So that deals with the positive reasons for the policy.  What about the reasons why we shouldn't do this?

    First, there is the collateral damage.  I hinted at this above - my daughter being unable to choose an outfit for her holiday because the site also showed lingerie.  I can assure you that any filter will either be a waste of time (letting all sorts through) or will make your online life a nightmare.  If you don't believe me, if you think a filter can be written which works perfectly, then all I can say is go write it!  I'll patent it for you, in return for a slice of the royalties.  We will make a mint, because the perfect online filter is a prime example of a product that is in high demand but which does not exist.  Larry Page and Sergey Brin will look like losers next to us.

    It's not difficult to realise why.  How do you plan to filter it?  By hand? Impossible, too much volume: YouTube alone gets uploads of something like 48 hours' worth of video per minute.  So to check YouTube alone, you would need several thousand people working 8-hour shifts doing nothing but watching YouTube.  Multiply that by every site on the web, and you soon have most of the country sitting down all day checking for porn.  Perversely, that would actually make sure that at least someone watched all of the worst stuff.

    So you need automation.  It can look at the content or the words, or a combination of both.  If it looks for both, or for just the content, then people will publish text-only sites with links and instructions for accessing secure servers.  If it looks at the words, then this page and (say) @_millymoo's blog (http://www.beneaththewig.com) will be blocked immediately.  Neither are pornographic.  Both are laden with juicy keywords.
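    To see just how crude keyword matching is, consider a toy filter. This is a deliberately naive sketch; the blocklist and the sample pages are invented for illustration.

    # Deliberately naive keyword filter, of the kind criticised above.
    # The blocklist and the sample pages are invented for illustration.
    BLOCKLIST = {"lingerie", "porn", "xxx"}

    def is_blocked(page_text: str) -> bool:
        # Strip punctuation, lowercase, and test for any blocked word.
        words = {w.strip(".,!?\"'()").lower() for w in page_text.split()}
        return bool(words & BLOCKLIST)

    shop = "Summer dresses, swimwear and lingerie for your holiday wardrobe."
    essay = "This essay argues that the porn filter policy is illiberal."

    print(is_blocked(shop))   # True: a clothes shop gets blocked
    print(is_blocked(essay))  # True: an article about the policy gets blocked

    Both pages are perfectly legal, and neither is pornographic, yet both trip the filter; the precision only gets worse as the word list grows.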

    So there is the first argument.  You are going to have to opt-in for porn in order to read this blog. OK, I know the easy answer to that is to question why you would ever want to read this again, but bear with me on this for a while:

    • This blogpost questions government policy.
    • The government policy in question would hinder your ability to read this blogpost and its questioning of the government policy.  

    If that doesn't scare you, then you need to do some thinking.

    You could start by thinking about who defines the content that is excluded, who has oversight over that, and who has the right to challenge it.  Because once porn is out, why should children be exposed to unpalatable extremist views - such as terrorist sympathisers, for example?  Or racists? Or smoking (we're thinking of the kids, remember)?  Or global warming deniers...?

    In fact, how would you propose to argue in support of people being able openly to promote illegal activity via the web?

    Promoting illegal activity such as, say, the right to do something that is currently proscribed?  Or, if I may re-word that, calling for the law to be changed.

    So there is the second objection; just as there is no clear bright line that a filter can use to detect porn, there is no clear bright line saying where this should stop.  

    And, in case you were wondering, there is an easy way to argue in support of people being able openly to promote illegal activity, by any medium.  It's here. Go read it.
              Systems Analyst (Teradata Database Developer) - McCain Foods (Canada) - Florenceville, NB   
    McCain Foods is seeking a Systems Analyst, specialized in Teradata database development, to contribute to the success of our Enterprise Data Warehouse (EDW)...
    From McCain - Tue, 04 Apr 2017 12:46:27 GMT - View all Florenceville, NB jobs
          Application Engineer - Eaton - Quebec   
    With Enterprise Software applications, database. Customer requirements from substations integration to enterprise level....
    From Eaton - Fri, 23 Jun 2017 15:26:45 GMT - View all Quebec jobs
              Greymatter   
    Greymatter is the original opensource weblogging and journal software. With fully-integrated comments, searching, file uploading and image handling, completely customisable output through dozens of templates and variables, multiple author support, and many other features—while having perhaps the simplest installation process and easiest-to-use interface of any program offering this level of functionality—Greymatter permanently raised the bar for weblogging and journaling, and it remains the program of choice for tens of thousands of people around the world. For anyone interested in creating an online weblog or journal, Greymatter offers a new level of power and control. Take a look at the features that Greymatter offers:

    • Runs On Your Account—Greymatter runs completely on your own server at all times, and is always under your full control; you're never dependent on the reliability (or privacy reassurances) of any outside source whatsoever.
    • Comment Posting—Make your weblog/journal come alive with the built-in ability for your users to add comments directly to your entries (no PHP/SQL database required); every aspect of Greymatter's comments is completely customisable and controllable.
    • One-Click Bookmarklets (Internet Explorer users only)—Now you can add a new entry to your Greymatter site from anywhere on the web, instantly linking to any other site with one click.
    • Built-In Searching—Greymatter now lets you add searching to your site to allow visitors to search through your entries (with fully-customisable output, of course), as well as the ability to perform internal searching from within Greymatter—authorised authors can even easily search and replace text across all entries.
    • Built-In File Uploading & Easy Image Handling ....

    Requirements:
    • An FTP client such as CuteFTP
    • A web account which offers full support for Perl 5 software
    • A modest comfort level with HTML code (to customise the templates)
    • An imagination
              Systems Analyst (Teradata Database Developer) - McCain Foods (Canada) - Florenceville, NB   
    Florenceville-Bristol is a small community nestled on the banks of the Saint John River and offers a global work environment with a small-town pace of life....
    From McCain - Tue, 04 Apr 2017 12:46:27 GMT - View all Florenceville, NB jobs
              New Post: Extract comments which are inside the function    
    It's not possible to extract comments from within the function body. XML comments don't work that way. The only place that they can appear is outside of the function body. The compiler won't pick them up anywhere else.

    Just my opinion here but perhaps you're over-documenting. Reserve the XML comments for what's really important about the method, such as why you're doing something a particular way rather than the minute details about how you wrote it. If a particular algorithm is complex, describe it in more readable terms rather than just giving a line-by-line breakdown from the comments within it. For example, describe the purpose of TestMethod and, if necessary, any complex processing that it performs. I don't need to know that you're checking settings and connecting to a database in the help file. I can see that stuff by looking at the code if I really want to do so. If the function is simple enough, the summary element may be all that's needed and the remarks element can be omitted.
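    For illustration, the poster's TestMethod documented that way might look like this (a sketch only, not code from the thread):

        /// <summary>
        /// Checks the settings, verifies the connection, and connects to the database.
        /// </summary>
        public void TestMethod()
        {
            // The step-by-step details stay as ordinary comments; the compiler
            // ignores them when building the XML documentation.
            // 1. check the settings
            // 2. check the connection
            // 3. connect to the database
        }

    The help file then carries only the one-sentence summary, while the numbered steps remain visible to anyone reading the source.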

    Also, please note that the project has been moved to GitHub. I'm more likely to see issues posted there and respond sooner.

    Eric

              New Post: Extract comments which are inside the function    
    I'm trying to extract comments from the function body like below but have no idea...

    [source code]
    public void TestMethod()
    {
    // 1. check the settings
    // 2. check the connection
    // 3. connect database
    }

    [help file]
    1. check the settings
    2. check the connection
    3. connect database
    I don't want to put those comments above the function declaration
    ///<remarks>
    /// 1. check the settings
    /// 2. check the connection
    /// 3. connect database
    ///</remarks>
    public void TestMethod()
    {
    // 1. check the settings
    // 2. check the connection
    // 3. connect database
    }

    I need to write the same comments twice and the remarks section is too large to read easily.

    Your help is highly appreciated

    Thanks

              Data Analyst - Addison Group - Oklahoma City, OK   
    Strong knowledge of and experience with reporting packages (Business Objects etc), databases (SQL etc), programming (XML, Javascript, or ETL frameworks)....
    From Indeed - Thu, 29 Jun 2017 15:35:57 GMT - View all Oklahoma City, OK jobs
              Update: KentouShogiban (Utilities)   

    KentouShogiban 1.76


    Device: iOS Universal
    Category: Utilities
    Price: $4.99, Version: 1.72 -> 1.76 (iTunes)

    Description:

    KentouShogiban is the App for examining positions in a Shogi game. You can record your moves with notes. You can also play back a Shogi game record.


    [Features]
    ・Playback a game record
    ---Pasting copied game record
    ---Selecting a saved game on Saved Game List
    ---Selecting a shared game on the Shared Game List registered by other users
    ---Selecting a public game on the Public Game List
    ---Selecting a game record file in the iTunes shared folder
    ---Selecting a game record file in the Dropbox folder
    ・Flip board
    ・Display the board coordinate axis labels
    ・Examine any positions of the game which you are playing back
    ・Record your notes on each move while examining
    ・Examine and record branches (variations of the moves)
    ・Edit state of the game freely with no restrictions
    ・Import game record files from your storage
    ---Format:KIF, KI2(".kif", ".ki2")
    ---Encode:UTF8, Shift-JIS
    ---Zipped files are acceptable(".zip")
    ・Saved Game: You can manage game records stored in the local database by using folders
    ・Export saved game
    ---Format: KIF, CSA
    ---Encode: UTF8
    ---Output To: Dropbox, E-mail, Pasteboard
    ・Share Games: you can share games with other devices or other users
    ・Import game records of public major games from its site easily
    ・Visit public major game sites easily
    ・Retrieve your game records on Shogi club 24 easily

    [Support Device]
    4 inches or bigger

    [[Basic Usage]]
    [Views]
    There are three views. The switch button to shift views is available at the top of each view.

    ・Playback View
    Playback View is just for playing back a game record. You can see this view when the App is launched or when you paste game record text into the App by tapping the paste menu.

    ・Examine View
    Examine View is for examining the position of the game by moving pieces yourself. You can examine various sets of moves as branches while writing notes. When you move to this view from the Playback View, the position of the game on the Playback View is copied and appears on the Examine View. You can save your examined moves and notes.

    ・Free Edit View
    Free Edit View is for freely replacing pieces regardless of the Shogi rules. You can use this view when you want to reset the location of some pieces at once. Additionally, you can use this for making Shogi Problems. When you want to make the answer of a Shogi Problem, move on to the Examine View, register the answer moves, and save it. If you save the position of a Shogi Problem on the Free Edit View with the Handicap option as "Composed Shogi Problem", you can use the filter option on the Saved Game List to distinguish Shogi Problems.

    [Operations on the Examine View]
    ・Move a piece
    Pick up the piece you want to move by tapping, then tap the cell on the board or the piece stand.

    ・Make a branch(variation of the moves)
    Change to the position of the game, then press the Add Branch button (+).

    ・Shift branches
    The branch button appears at the right side of the move on the move list. By pressing it, the move list will shift to the branch move. If there are several branches on a move, a popup menu will appear when you press the branch button. You can choose the branch you want to shift to on the menu. To go back to the previous branch, press the back branch button.

    ・Remove a branch
    By pressing the remove branch button (-), the currently displayed branch will be removed.

    ・Solve easy Shogi problem
    Press the Checkmate button on top of the move list.

    [Operations on the Free Edit View]
    ・Turn over a piece
    Swipe: swipe on a piece up or down.

    ・Rotate a piece
    Swipe: swipe on a piece left or right.

    [Search] (on Saved Game List, Shared Game List)
    ・AND filter: filter strings divided by white space (ex. 名人戦 羽生vs森内:名人戦 羽生 森内)
    ・EXCLUDE filter: `!' mark (ex. not include `NHK' string and 羽生vs森内:!nhk 羽生 森内)
    ・designate the number of moves (ex. less than 100 moves:m<100 )

    What's New

    ・Enabled playing back moves on the playback pages of the games below by pasting their URLs.
    - Akirao game(Eiou)
    - NHK Trophy
    - Kansai encourage meetings good office collection
    ・Changed the icons of the menu button and the submenu button on the toolbar
    ・Adapted to the new Dropbox API
    ・Fixed some minor bugs

    KentouShogiban


              Uric Acid - Natural Gout Treatment - Resources For Constructing A Free Gout Diet Plan   
    So much of natural gout treatment requires diet and every gout sufferer has to pay very close attention to what they eat and drink. Imagine how useful a nutrition database on your home computer is; one that you can quickly access, without going to...
              Exchange 2010 Mailbox Database Recovery 8.7 (Shareware)   
    With the Enstella Exchange 2010 Mailbox Database Recovery Tool you get an idea of how to recover a mailbox from Exchange 2010 and convert it into PST Outlook/EML/MSG and HTML. Free software downloads ...
              Nero, Hoards and Aberdeen Ships   
    This month has seen an archaeological spoil heap the size of Nero's party leftovers. And it's been quite a month for Roman archaeologists who've just announced the positive identification of a very rare portrait of young Nero from the site of Fishbourne (the interview was recorded the day before 3D scans confirmed his identity). Also announced was the discovery of a very likely candidate for Nero's great banqueting hall. Our Anglo Saxon man, Tom Birch, discusses the incredible Staffordshire hoard and in Backyard Archaeology we find out about the Aberdeen database of ships with contributors from across the globe.
              Republican Representative Barbara Sears Blocks Effort Against Obamacare   

    In early March, Ohio State Representative Ron Young and Rep. Andy Thompson introduced a bill known as “The Health Care Freedom Act” (HCFA) that proposed a new line of defense against the Patient Protection and Affordable Care Act, or Obamacare.  The bill, when passed, will prohibit health insurance companies in Ohio from accepting any federal funding that would trigger penalties for employers or individuals who aren’t compliant with Obamacare.  Wednesday, when the bill was brought up in committee, opposition arose, but not only from the expected side of the aisle.  While the Democrats did balk at the bill, Republican Majority Floor Leader Barbara Sears also took issue with HCFA.  One needn’t look too deep to understand why Sears wouldn’t want the HCFA to pass in Ohio.  Not only has she received a substantial amount of financial contributions from the health care industry, she currently works at a health insurance provider and recently passed her own bill which helps implement Obamacare.

    Representative Sears is currently serving her third term in Ohio and over the years has amassed nearly $1 million in campaign contributions from various members of the health care industry.  In fact, her list of donors is a veritable who’s who of health care heavy hitters including: Humana, Merck & Co, Aetna, United Health Care, Johnson and Johnson and many more national players.  When she’s not representing the people of Ohio (or the health care industry) in the House, she works as the Senior Vice President of Employee Benefits at Roemer Insurance, whose website refers to her as a “resource,” as well as an employee.  Perhaps it was this spirit of being a “resource” that led Sears to introduce HB 3, a bill that regulates the “navigators” established in Obamacare.

    Navigators will be individuals tasked with helping citizens through the maze of Obamacare before they actually purchase insurance.  According to the federal law, their duties consist of:

     (A) conduct public education activities to raise awareness of the availability of qualified health plans;

    (B) distribute fair and impartial information concerning enrollment in qualified health plans, and the availability of premium tax credits under section 36B of the Internal Revenue Code of 1986 and cost-sharing reductions under section 1402;

    (C) facilitate enrollment in qualified health plans;

    (D) provide referrals to any applicable office of health insurance consumer assistance or health insurance ombudsman established under section 2793 of the Public Health Service Act, or any other appropriate State agency or agencies, for any enrollee with a grievance, complaint, or question regarding their health plan, coverage, or a determination under such plan or coverage; and

    (E) provide information in a manner that is culturally and linguistically appropriate to the needs of the population being served by the Exchange or Exchanges.

    The law, which mandates navigators, specifically bars them from issuing health insurance and directs that funding for the new jobs must come from the state exchanges.  The issue of navigators has been surprisingly absent from the news, considering the large amount of money states will have to come up with to comply with this aspect of Obamacare.  California, notorious for its problems with debt, is slated to spend hundreds of millions of dollars to hire 21,000 navigators.

    So why would a Republican propose a bill that seeks to further regulate a government-created job that will cost the states untold amounts of money?  It would appear that insurance brokers across the country are getting nervous about the prospect of competition from navigators and have been lobbying for stricter standards on them.  One such group, the Independent Insurance Agents and Brokers of America, has been lobbying nationwide, and its Ohio affiliate has contributed financially to Rep. Sears’ campaigns since 2010.  The passage of Sears’ bill restraining the navigators follows similar bills in Maine and Iowa. In Sears’ case, however, even setting aside the steep amount of money she has received from the health care industry, the fact that she works at an insurance agency that will benefit from her bill seems a conflict of interest.

    Her motives become even more suspect when considering her subsequent decision to speak out against the HCFA, a bill that will protect consumers against federal penalties for noncompliance with Obamacare.  According to Sears, the HCFA violates the state constitution by preventing health insurance providers from selling a product.  “If I pass a law that tells my carriers that if they accept any funding under (Obamacare)... if I comply in that area, my penalty is suspension of my ability to accept new enrollees in the plan,” she stated.  Yet, as Rep. Young explained, the HCFA doesn’t limit insurance providers any more than the refusal by the state to enact its own exchange.  Why would a member of the health insurance industry resist an effort to combat the broad-sweeping mandates of Obamacare?  Does Sears really believe insurance companies won’t be able to sell their products without federal funding?  Perhaps it is the case that she believes insurance companies care more about receiving federal money than about the consumers they are supposed to be protecting.  Considering her close ties to the industry, Sears may once again be representing the concerns of insurance providers instead of Ohioans, as she did in regulating navigators.

    The HCFA needs Rep. Sears’ support to continue toward passage.  If she is truly representing the people of Ohio and not the health insurance industry and herself, backing a law protecting her constituents makes logical sense.  As the country counts down to Obamacare taking effect, all eyes will be on Ohio to lead the way out of the coming disaster.

    Ask Representative Sears about her motives.


              Forum Post: RE: GP2015R2 Manual payments document number not defaulting from chequebook   
    It seems, on doing some forensics on our databases, that this is indeed down to the payables document management module getting switched on somehow - midway through the day on month end. Not amused, as I was on holiday and ended up working it instead. Turned debtor document management and payables document management off; now behaviour in cash receipts and manual payments is back to how it used to be. Tim.
              Senior Oracle Database Administrator - Wesco Aircraft - Austin, TX   
    Wesco Aircraft is the world’s leading distributor and provider of comprehensive supply chain management services to the global aerospace industry, based on
    From Wesco Aircraft - Mon, 12 Jun 2017 20:10:00 GMT - View all Austin, TX jobs
              Comment on Sen. John Kerry saves a boatload of taxes by mooring new yacht in Rhode Island by Cheap NFL Jerseys Online   
    <strong>Cheap NFL Jerseys Online</strong> You want a site that updates their database at least one time per week.The Indianapolis Colts of course are more than satisfied with their moment in glory. He has wide experience on golf clubs Thomas remembers first going live at the 1996 U. The introd…
              Reporting database by markwade1   
    Develop SQL database for laboratory data. Develop presentable reports in PPT and PDF. Additional functions include utilizing Excel, Word and PDF export of data. Manipulate sql data to graph and construct... (Budget: $3000 - $5000 USD, Jobs: MySQL, PHP)
              Paid summer internship to report on the business of Wall Street   
    InvestmentWires.com is seeking interns who want to learn the Wall Street beat.

    We are a small trade publisher whose readers are the leaders of the mutual fund and 401(k) businesses. We want to teach interns who are eager to learn the ins and outs of business reporting and the financial services business.

    You'll have an opportunity to write, work on our research projects and help with our events.

    Ability to write and write quickly is a must. Familiarity with databases and Web design is an edge.

    This is a paid internship for the summer. To apply, email jobs@investmentwires.com.
              System and Database Administrator - Beaconhouse National University - Lahore   
    Minimum Qualification: BCS/BSCS Certification of OCP and Linux/MCSE with hands on experience will be an added advantage. Experience: At least 3 Years Job...
    From Beaconhouse National University - Wed, 14 Jun 2017 07:24:41 GMT - View all Lahore jobs
              Point - Counterpoint Debate   
    Government students--

    Today you will start your research for the Point - Counterpoint Debate. The SMHS library subscribes to databases that will be extremely helpful in your search for relevant information.

    • Issues and Controversies: before you search, scan Issues in the Headlines, subject index, & Issues: Pros and Cons. Your topic will probably be listed in one of these areas. If it isn't, use Advanced Search.
    • Opposing Viewpoints Resource Center: scan popular topics before searching. When you search, decide if you want to search by subject, keyword, or entire document. You can limit your search by reading level (basic, intermediate, & advanced). Start by reading viewpoint articles. If you need background information, look under the reference tab.
    If you have any questions, leave a comment, and I will answer your question on the blog.
              Podcasting in Ms. Pennington's Class - 3rd Period   
    "Students in Ms. Pennington's English classes were given the opportunity to select a book to read for pleasure in a small group (similar to a book club). Mr. Kiely, our librarian, helped students with recommendations and input on how to use the San Mateo County library database to check for availability. While students had to keep informal logs of what they read and checked in with each other weekly to discuss their reading, this project was fairly independent in nature. Ultimately, though, students produced a podcast, with the help of Mr. Kiely. Some students simulated a call-in radio show, writing parts for the author of their book and his or her readers; some students created a trailer for their book in order to entice readers; others imagined the characters from their novel in conversation with each other. All in all, students enjoyed the freedom to choose both a book and the content of their final assignment."

    --Ms. Pennington

    3rd Period Podcasts:
    Click on icon to listen


    Upstate by Kalisha Buckhanon
    Podcast by Elsa, Jocelyn, Rosario, & Jesus




    Twilight by Stephanie Meyer
    Podcast by Andy, Aaron, Raul, & Rafael




    Confessions of an Ugly Stepsister by Gregory Maguire
    Podcast by Marissa, Michelle, Nicole, Alia, & Claire




    King Dork by Frank Portman
    Podcast by Ramez, Yash, & Michael




    Always Running by Luis Rodriguez
    Podcast by Raquel, Dioselin, Viri, Stephanie, Gemma, & Libby



    Creative Commons License


    This work is licensed under a
    Creative Commons Attribution-Noncommercial 3.0 United States License.
              Astronomy   
    Sputnik

    Today in the library you will continue your research on Sputnik from the NY Times' Science section. You will also begin to research a similar modern day event. Make sure to use the library databases for your research.

    Make sure to keep track of your sources. You will need to create a Works Cited slide for your presentation. To cite a podcast or a video from the NY Times site, use the following examples:

    MLA podcast citation



    MLA video citation



    For all other citations, use Noodle Tools. Select the "NoodleBib Express" link and make sure to choose MLA. Noodle Tools will create your works cited entries for you. You will need to do one entry at a time and copy them to your works cited slide or word document.

    Images

    If you want to use images in your PowerPoint presentation, make sure that you use images that are Creative Common licensed. Use this link to search for Creative Common licensed images.

    If you have any questions about citations, images, or PowerPoint, post them to this blog.
              New Library Website   
    Welcome to the new San Mateo High Library website/blog. This site will be updated daily. In the sidebar on the right there are links to the library calendar, the library online catalog, and online tools including subscription databases. If you have a question about anything in a particular post, feel free to leave a comment.

    Teachers: if you would like to schedule time in the library, please check the calendar and then send me an email. If you want to drop in with a class without a scheduled time, check the calendar and give me a call at #2327. If the library is free, you can come right down.

    Students: if you would like to come to the library during tutorial, please get a pass before brunch. I usually limit the number of tutorial passes to 25; they go fast! If you would like to use a computer in the library, you must have a student ID with an Internet sticker. If you don't have it with you, you cannot use a computer.

    Check back for library events and updates.
              MSSQL Database Administrator Job - HealthPartners - Richfield, MN   
    Windows, AIX, Linux). We currently have an exciting new opportunity for a MSSQL Database Administrator....
    From HealthPartners - Fri, 02 Jun 2017 06:29:24 GMT - View all Richfield, MN jobs
              Oracle Database Administrator Job - HealthPartners - Richfield, MN   
    Windows, AIX, Linux). We currently have an exciting new opportunity for an Oracle Database Administrator....
    From HealthPartners - Mon, 01 May 2017 21:23:19 GMT - View all Richfield, MN jobs
              Business Intelligence Application Database Administrator (DBA) - Protective Life Corporation - Birmingham, AL   
    Handles inbound database service requests from the Business Intelligence team, providing priority response....
    From Protective Life Corporation - Thu, 20 Apr 2017 18:31:04 GMT - View all Birmingham, AL jobs
              Program Coordinator, FDA and Third Party Tobacco Retail Inspection Program - JBS International, Inc. - Maryland   
    Experience in conducting data analysis, preferably using SQL server databases. A Bachelor's degree, in business administration, public administration, criminal...
    From JBS International, Inc. - Fri, 02 Jun 2017 03:23:28 GMT - View all Maryland jobs
              Web Application Developer / Backend Programmer - Marketing Results - Henderson, NV   
    Familiarity with server/network architecture is a plus. Marketing Results is a gaming industry pioneer in high-tech consumer database marketing....
    From Indeed - Thu, 15 Jun 2017 21:31:12 GMT - View all Henderson, NV jobs
              Senior Oracle Database Administrator - Wesco Aircraft - Austin, TX   
    The Lead Oracle Database Administrator (DBA) will be responsible for leading the Oracle database administration team to provide the design, implementation and...
    From Wesco Aircraft - Mon, 12 Jun 2017 20:10:00 GMT - View all Austin, TX jobs
          E-mail Marketer (32-40 hours) - IWR head office - Emst   
    Together with the E-Commerce Manager, you define the accompanying strategy and look for chances and opportunities to grow the database and convince guests...
    From International Wellness Resorts - Wed, 07 Jun 2017 13:08:57 GMT - View all Emst jobs
              FAQ   
    Akismet checks your comments and contact form submissions against our global database of spam to protect you and your site from malicious content.
              Business Intelligence Developer - Pinnacle Partners - Indianapolis, IN   
    Bachelor’s degree and 5+ years of professional experience. An understanding of relational database and business intelligence concepts. Knowledge of MS SQL Server,... $90,000 a year
    From Pinnacle Partners - Thu, 18 May 2017 18:18:17 GMT - View all Indianapolis, IN jobs
              Manager, Financial Planning and Analysis - ABBOTT LABORATORIES - Sylmar, CA   
    Demonstrated familiarity/experience with a Multidimensional Database Management System (MDBMS) such as Hyperion Essbase....
    From Abbott Laboratories - Sat, 01 Jul 2017 10:36:03 GMT - View all Sylmar, CA jobs
              Finance Manager - Business Process Improvement Project Manager - Verizon - Basking Ridge, NJ   
    Knowledge in the usage of Essbase, Smart View, Hyperion Planning, Hyperion Strategic Finance, Oracle DRM, Collibra, and relational database applications and...
    From Verizon - Thu, 29 Jun 2017 10:58:19 GMT - View all Basking Ridge, NJ jobs
              Database Administrator / Architect - Morgan Stanley - New York, NY   
    Data warehouse and big data lake architect, supporting an enterprise data warehouse platform designed and developed to store, process, curate and distribute the...
    From Morgan Stanley - Wed, 22 Mar 2017 15:13:32 GMT - View all New York, NY jobs
              Teradata Database Management Analyst - JP Morgan Chase - Columbus, OH   
    Position is for production support of a mature data warehouse which follows regular ticketing and change processes. JPMorgan Chase & Co....
    From JPMorgan Chase - Tue, 30 May 2017 10:38:07 GMT - View all Columbus, OH jobs
              Azure With ADL/ADW/ADF - Sonsoft Inc - Redmond, WA   
    Strong technical background with knowledge of BI foundational concepts such as relational database, datawarehouse, data mart....
    From Sonsoft Inc - Sat, 24 Jun 2017 03:08:32 GMT - View all Redmond, WA jobs
              Using fromArray() to Write Raw Values to the DB   
    In the previous article, we saw a way to write raw values to the MODX database with PDO. In this one, we’ll see a slightly slower, but much more convenient …
          How LINQ Compiles to CIL   

    LINQ-related:

    LINQ is compiled in the following way:

    1. First, the LINQ query expression is converted into method calls:

      public static void Main()
      {
          var query = db.Cars.Select<Car, Car>(c => c);
          foreach (Car aCar in query)
          {
              Console.WriteLine(aCar.Name);
          }
      }

    2. If the type of db.Cars is IEnumerable<Car> (that is, LINQ to Objects), then the lambda expression becomes a separate method:

      private Car lambda0(Car c)
      {
          return c;
      }

      private Func<Car, Car> CachedAnonymousMethodDelegate1;

      public static void Main()
      {
          if (CachedAnonymousMethodDelegate1 == null)
              CachedAnonymousMethodDelegate1 = new Func<Car, Car>(lambda0);
          var query = db.Cars.Select<Car, Car>(CachedAnonymousMethodDelegate1);
          foreach // ...
      }

      In reality the method is not called lambda0 but something like <Main>b__0 (where Main is the name of the containing method). Likewise, the cached delegate is actually called CS$<>9__CachedAnonymousMethodDelegate1.

      If you are using LINQ to SQL, then the type of db.Cars will be IQueryable<Car>, and this step is very different. Instead, the lambda expression is turned into an expression tree:

      public static void Main()
      {
          var parameter = Expression.Parameter(typeof(Car), "c");
          var lambda = Expression.Lambda<Func<Car, Car>>(parameter, new ParameterExpression[] { parameter });
          var query = db.Cars.Select<Car, Car>(lambda);
          foreach // ...
      }
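      To make the overload distinction concrete, here is a small self-contained demo (my addition, not part of the original post; it uses an int array in place of the db.Cars table):

        using System;
        using System.Linq;
        using System.Linq.Expressions;

        class OverloadDemo
        {
            static void Main()
            {
                int[] numbers = { 1, 2, 3 };

                // LINQ to Objects: the lambda compiles to a delegate.
                Func<int, int> asDelegate = n => n * 2;
                var fromEnumerable = numbers.Select(asDelegate);

                // IQueryable: the same lambda compiles to an expression tree,
                // which a provider such as LINQ to SQL can translate to SQL.
                Expression<Func<int, int>> asTree = n => n * 2;
                var fromQueryable = numbers.AsQueryable().Select(asTree);

                Console.WriteLine(asTree.Body);                       // (n * 2)
                Console.WriteLine(string.Join(",", fromEnumerable));  // 2,4,6
                Console.WriteLine(string.Join(",", fromQueryable));   // 2,4,6
            }
        }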
    3. The foreach loop becomes a try/finally block (this is the same for both):

      IEnumerator<Car> enumerator = null;
      try
      {
          enumerator = query.GetEnumerator();
          Car aCar;
          while (enumerator.MoveNext())
          {
              aCar = enumerator.Current;
              Console.WriteLine(aCar.Name);
          }
      }
      finally
      {
          if (enumerator != null)
              ((IDisposable)enumerator).Dispose();
      }

    4. Finally, this is compiled to IL in the expected way. The following is the IEnumerable<Car> version:

      // Put db.Cars on the stack
      L_0016: ldloc.0
      L_0017: callvirt instance !0 DatabaseContext::get_Cars()
      // “if” starts here
      L_001c: ldsfld Func<Car, Car> Program::CachedAnonymousMethodDelegate1
      L_0021: brtrue.s L_0034
      L_0023: ldnull
      L_0024: ldftn Car Program::lambda0(Car)
      L_002a: newobj instance void Func<Car, Car>::.ctor(object, native int)
      L_002f: stsfld Func<Car, Car> Program::CachedAnonymousMethodDelegate1
      // Put the delegate for “c => c” on the stack
      L_0034: ldsfld Func<Car, Car> Program::CachedAnonymousMethodDelegate1
      // Call to Enumerable.Select()
      L_0039: call IEnumerable<!!1> Enumerable::Select<Car, Car>(IEnumerable<!!0>, Func<!!0, !!1>)
      L_003e: stloc.1
      // “try” block starts here
      L_003f: ldloc.1
      L_0040: callvirt instance IEnumerator<!0> IEnumerable<Car>::GetEnumerator()
      L_0045: stloc.3
      // “while” inside try block starts here
      L_0046: br.s L_005a
      // body of while starts here
      L_0048: ldloc.3
      L_0049: callvirt instance !0 IEnumerator<Car>::get_Current()
      L_004e: stloc.2
      L_004f: ldloc.2
      L_0050: ldfld string Car::Name
      L_0055: call void Console::WriteLine(string)
      // while condition starts here
      L_005a: ldloc.3
      L_005b: callvirt instance bool IEnumerator::MoveNext()
      L_0060: brtrue.s L_0048  // end of while
      L_0062: leave.s L_006e   // end of try
      // “finally” block starts here
      L_0064: ldloc.3
      L_0065: brfalse.s L_006d
      L_0067: ldloc.3
      L_0068: callvirt instance void IDisposable::Dispose()
      L_006d: endfinally

      The compiled code for the IQueryable<Car> version is also as expected. Here is the important part that differs from the above (the local variables now have different offsets and names, but let's ignore that):

      // typeof(Car)
      L_0021: ldtoken Car
      L_0026: call Type Type::GetTypeFromHandle(RuntimeTypeHandle)
      // Expression.Parameter(typeof(Car), "c")
      L_002b: ldstr "c"
      L_0030: call ParameterExpression Expression::Parameter(Type, string)
      L_0035: stloc.3
      // Expression.Lambda(...)
      L_0036: ldloc.3
      L_0037: ldc.i4.1                 // var paramArray = new ParameterExpression[1]
      L_0038: newarr ParameterExpression
      L_003d: stloc.s paramArray
      L_003f: ldloc.s paramArray
      L_0041: ldc.i4.0                 // paramArray[0] = parameter;
      L_0042: ldloc.3
      L_0043: stelem.ref
      L_0044: ldloc.s paramArray
      L_0046: call Expression<!!0> Expression::Lambda<Func<Car, Car>>(Expression, ParameterExpression[])
      // var query = Queryable.Select(...);
      L_004b: call IQueryable<!!1> Queryable::Select<Car, Car>(IQueryable<!!0>, Expression<Func<!!0, !!1>>)
      L_0050: stloc.1

    Posted by 墙头草, 2013-02-06 14:49

              SQL Database Administrator 3   

              Still not Filed an Income Tax Return ? Here is the Step what to Do ?   

    File All Tax Returns

    Taxpayers should file all tax returns that are due, regardless of whether or not full payment can be made with the return. Depending on an individual’s circumstances, a taxpayer filing late may qualify for a payment plan. All payment plans require continued compliance with all filing and payment responsibilities after the plan is approved.

    Facts About Filing Tax Returns

    • Failure to file a return or filing late can be costly. If taxes are owed, a delay in filing may result in penalty and interest charges that could increase your tax bill by 25 percent or more.
    • There is no penalty for failure to file a tax return if a refund is due. But by waiting too long to file, you can lose your refund. In order to receive a refund, the return must be filed within 3 years of the due date. If you file a return, and later realize you made an error on the return, the deadline for claiming any refund due is three years after the return was filed, or two years after the tax was paid, whichever expires later.
    • Taxpayers who are entitled to the Earned Income Tax Credit must file a return to claim the credit even if they are not otherwise required to file. The return must be filed within 3 years of the due date in order to receive the credit.
    • If you are self-employed, you must file returns reporting self-employment income within three years of the due date in order to receive Social Security credits toward your retirement.

    NOTE: Taxpayers who continue to not file a required return and fail to respond to IRS requests for a return may be considered for a variety of enforcement actions. Continued non-compliance by flagrant or repeat nonfilers could result in additional penalties and/or criminal prosecution.

    Getting Free Help to File Late Returns

    The IRS offers free assistance by computer, telephone, facsimile and in person. The IRS can assist taxpayers with obtaining forms, publications, and answers to a wide range of tax questions.

    If you are a wage-earner, and have misplaced your W-2 Forms showing your income and income tax withholding, and you are unable to obtain duplicate copies from your employer, IRS can often provide you with that information after the annual matching programs are run. Matching programs are run after filing season, and the information is usually available in late August of the year in which the tax return is due.

    If you think your employer did not report your wages, contact IRS for assistance on how to file your tax returns. If you can establish that your employer withheld taxes on your salary (normally by providing pay stubs), you will receive credit for your social security and income tax withholding even if IRS did not receive the withheld tax. If your employer failed to withhold the taxes, you must still file your return.

    The Volunteer Income Tax Assistance (VITA) program and IRS e-file joined forces several years ago to bring electronic tax filing to VITA sites. Since then, volunteers prepare tax returns on computers and ultimately transmit them electronically to the IRS. It’s free of charge for individuals of low to moderate income.

    Individuals and joint filers whose income exceeds VITA program criteria, as well as businesses (i.e. filing Schedules C and E) should seek professional assistance for return preparation. The Authorized IRS e-file Provider database is a nationwide listing of all businesses that have been accepted to participate in the electronic filing (IRS e-file) program.

    Documents Required to Prepare a Return

    In order for the IRS to assist with preparing a tax return, taxpayers should bring any and all information related to income and deductions for the tax years for which a return is required to be filed. Some of the documents may include:

    • Forms W-2 – Forms from employers showing wages for the year.
    • Forms 1099 – Forms from banks and other financial institutions showing interest and dividends. Forms 1099 also report self-employment income.
    • Information on expenses to claim on the return, such as itemized deductions, child care expenses, or employee business expenses.
    • Social Security numbers for dependent children and any other person claimed as a dependent
    • A copy of the last tax return filed.

              QA Tester - Noviur Technologies - Vaughan, ON   
    Extracting data from database to cross reference against expected results within test scripts. Design, develop and maintain test plan and test cases utilizing...
    From Noviur Technologies - Thu, 08 Jun 2017 04:21:22 GMT - View all Vaughan, ON jobs
              Workplace Service Operation Analyst - BMW do Brasil - Araquari, SC   
    You will also be responsible for the leadership and execution of the Prima & CMDB databases, Group standards for print/file and Client SW services, and the Run...
    From BMW do Brasil - Wed, 15 Mar 2017 13:40:34 GMT - View all Araquari, SC jobs
              Knowledge E added to SHERPA/RoMEO database   
    Knowledge E has announced that all its copyright policies are now featured in the SHERPA/RoMEO database. All KnE conference proceedings and journal articles are covered under policies classified as 'RoMEO Green Publisher' status, as displayed in the SHERPA/RoMEO database, effective immediately.
              The State Geologic Map Compilation (SGMC) geodatabase of the conterminous United States   

    The State Geologic Map Compilation (SGMC) geodatabase of the conterminous United States (https://doi.org/10.5066/F7WH2N65) represents a seamless, spatial database of 48 State geologic maps that range from 1:50,000 to 1:1,000,000 scale. A national digital geologic map database is essential in interpreting other datasets that support numerous types of national-scale studies and assessments, such as those that provide geochemistry, remote sensing, or geophysical data. The SGMC is a compilation of the individual U.S. Geological Survey releases of the Preliminary Integrated Geologic Map Databases for the United States. The SGMC geodatabase also contains updated data for six States and seven entirely new State geologic maps that have been added since the preliminary databases were published. Numerous errors have been corrected and enhancements added to the preliminary datasets using thorough quality assurance/quality control procedures. The SGMC is not a truly integrated geologic map database because geologic units have not been reconciled across State boundaries. However, the geologic data contained in each State geologic map have been standardized to allow spatial analyses of lithology, age, and stratigraphy at a national scale.


              A hybrid machine learning model to predict and visualize nitrate concentration throughout the Central Valley aquifer, California, USA   

    Intense demand for water in the Central Valley of California and related increases in groundwater nitrate concentration threaten the sustainability of the groundwater resource. To assess contamination risk in the region, we developed a hybrid, non-linear, machine learning model within a statistical learning framework to predict nitrate contamination of groundwater to depths of approximately 500 m below ground surface. A database of 145 predictor variables representing well characteristics, historical and current field and landscape-scale nitrogen mass balances, historical and current land use, oxidation/reduction conditions, groundwater flow, climate, soil characteristics, depth to groundwater, and groundwater age were assigned to over 6000 private supply and public supply wells measured previously for nitrate and located throughout the study area. The boosted regression tree (BRT) method was used to screen and rank variables to predict nitrate concentration at the depths of domestic and public well supplies. The novel approach included as predictor variables outputs from existing physically based models of the Central Valley. The top five most important predictor variables included two oxidation/reduction variables (probability of manganese concentration to exceed 50 ppb and probability of dissolved oxygen concentration to be below 0.5 ppm), field-scale adjusted unsaturated zone nitrogen input for the 1975 time period, average difference between precipitation and evapotranspiration during the years 1971–2000, and 1992 total landscape nitrogen input. Twenty-five variables were selected for the final model for log-transformed nitrate. In general, increasing probability of anoxic conditions and increasing precipitation relative to potential evapotranspiration had a corresponding decrease in nitrate concentration predictions. Conversely, increasing 1975 unsaturated zone nitrogen leaching flux and 1992 total landscape nitrogen input had an increasing relative impact on nitrate predictions. Three-dimensional visualization indicates that nitrate predictions depend on the probability of anoxic conditions and other factors, and that nitrate predictions generally decreased with increasing groundwater age.


              Status and understanding of groundwater quality in the Bear Valley and Lake Arrowhead Watershed Study Unit, 2010: California GAMA Priority Basin Project   

    Groundwater quality in the 112-square-mile Bear Valley and Lake Arrowhead Watershed (BEAR) study unit was investigated as part of the Priority Basin Project (PBP) of the Groundwater Ambient Monitoring and Assessment (GAMA) Program. The study unit comprises two study areas (Bear Valley and Lake Arrowhead Watershed) in southern California in San Bernardino County. The GAMA-PBP is conducted by the California State Water Resources Control Board (SWRCB) in cooperation with the U.S. Geological Survey (USGS) and the Lawrence Livermore National Laboratory.

    The GAMA BEAR study was designed to provide a spatially balanced, robust assessment of the quality of untreated (raw) groundwater from the primary aquifer systems in the two study areas of the BEAR study unit. The assessment is based on water-quality data collected by the USGS from 38 sites (27 grid and 11 understanding sites) during 2010 and on water-quality data from the SWRCB-Division of Drinking Water (DDW) database. The primary aquifer system is defined by springs and the perforation intervals of wells listed in the SWRCB-DDW water-quality database for the BEAR study unit.

    This study included two types of assessments: (1) a status assessment, which characterized the status of the quality of the groundwater resource as of 2010 by using data from samples analyzed for volatile organic compounds, pesticides, and naturally present inorganic constituents, such as major ions and trace elements, and (2) an understanding assessment, which evaluated the natural and human factors potentially affecting the groundwater quality. The assessments were intended to characterize the quality of groundwater resources in the primary aquifer system of the BEAR study unit, not the treated drinking water delivered to consumers. The Bear Valley study area and the Lake Arrowhead Watershed study area were also compared statistically on the basis of water-quality results and factors potentially affecting the groundwater quality.

    Relative concentrations (RCs), which are the sample concentration of a particular constituent divided by its associated health- or aesthetic-based benchmark concentration, were used for evaluating the groundwater quality for those constituents that have Federal or California regulatory or non-regulatory benchmarks for drinking-water quality. An RC greater than 1.0 indicates a concentration greater than a benchmark. Organic (volatile organic compounds and pesticides) and special-interest (perchlorate) constituent RCs were classified as “high” (RC greater than 1.0), “moderate” (RC less than or equal to 1.0 and greater than 0.1), or “low” (RC less than or equal to 0.1). For inorganic (radioactive, trace element, major ion, and nutrient) constituents, the boundary between low and moderate RCs was set at 0.5.
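    As a worked illustration of this scheme (a hypothetical sketch, not part of the report; the example concentration and benchmark are invented), the classification reduces to one division and two thresholds:

        using System;

        class RelativeConcentrationDemo
        {
            // RC = sample concentration / benchmark concentration.
            // RC > 1.0 is "high"; the low/moderate boundary is 0.1 for organic
            // and special-interest constituents and 0.5 for inorganic ones.
            static string Classify(double sample, double benchmark, bool inorganic)
            {
                double rc = sample / benchmark;
                double lowBoundary = inorganic ? 0.5 : 0.1;
                if (rc > 1.0) return "high";
                if (rc > lowBoundary) return "moderate";
                return "low";
            }

            static void Main()
            {
                // Hypothetical inorganic example: 7 ppb against a 10 ppb
                // benchmark gives RC = 0.7, i.e. "moderate".
                Console.WriteLine(Classify(7.0, 10.0, inorganic: true));
            }
        }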

    Aquifer-scale proportion was used as the primary metric in the status assessment for evaluating groundwater quality at the study-unit scale or for its component areas. High aquifer-scale proportion was defined as the percentage of the area of the primary aquifer system with a RC greater than 1.0 for a particular constituent or class of constituents; the percentage is based on area rather than volume. Moderate and low aquifer-scale proportions were defined as the percentage of the primary aquifer system with moderate and low RCs, respectively. A spatially weighted statistical approach was used to evaluate aquifer-scale proportions for individual constituents and classes of constituents.

    The status assessment for the Bear Valley study area found that inorganic constituents with health-based benchmarks were detected at high RCs in 9.0 percent of the primary aquifer system and at moderate RCs in 13 percent. The high RCs of inorganic constituents primarily reflected high aquifer-scale proportions of fluoride (in 5.4 percent of the primary aquifer system) and arsenic (3.6 percent). The RCs of organic constituents with health-based benchmarks were high in 1.0 percent of the primary aquifer system, moderate in 8.1 percent, and low in 70 percent. Organic constituents were detected in 79 percent of the primary aquifer system. Two groups of organic constituents and two individual organic constituents were detected at frequencies greater than 10 percent of samples from the USGS grid sites: trihalomethanes (THMs), solvents, methyl tert-butyl ether (MTBE), and simazine. The special-interest constituent perchlorate was detected in 93 percent of the primary aquifer system; it was detected at moderate RCs in 7.1 percent and at low RCs in 86 percent.

    The status assessment in the Lake Arrowhead Watershed study area showed that inorganic constituents with human-health benchmarks were detected at high RCs in 25 percent of the primary aquifer system and at moderate RCs in 41 percent. The high aquifer-scale proportion of inorganic constituents primarily reflected high aquifer-scale proportions of radon‑222 (in 62 percent of the primary aquifer system) and uranium (26 percent). RCs of organic constituents with health-based benchmarks were moderate in 7.7 percent of the primary aquifer system and low in 46 percent. Organic constituents were detected in 54 percent of the primary aquifer system. The only organic constituents that were detected at frequencies greater than 10 percent of samples from the USGS grid sites were THMs. Perchlorate was detected in 62 percent of the primary aquifer system at uniformly low RCs.

    The second component of this study, the understanding assessment, identified the natural and human factors that could have affected the groundwater quality in the BEAR study unit by evaluating statistical correlations between water-quality constituents and potential explanatory factors. The potential explanatory factors evaluated were land use (including density of septic tanks and leaking or formerly leaking underground fuel tanks), site type, aquifer lithology, well construction (well depth and depth to the top-of-perforated interval), elevation, aridity index, groundwater-age distribution, and oxidation-reduction condition (including pH and dissolved oxygen concentration). Results of the statistical evaluations were used to explain the distribution of constituents in groundwater of the BEAR study unit.

    In the Bear Valley study area, high and moderate RCs of fluoride were found in sites known to be influenced by hydrothermal conditions or that had high concentrations of fluoride historically. The high RC of arsenic can likely be attributed to desorption of arsenic from aquifer sediments saturated in old groundwater with high pH under reducing conditions. The THMs were detected more frequently at USGS grid sites that were wells, part of a large urban water system, and surrounded by urban land use. Solvents, MTBE, and simazine were all detected more frequently at USGS grid sites that were wells with a greater urban percentage of surrounding land use and that accessed older groundwater than other USGS grid sites. Comparison between the observed and predicted detection frequencies of perchlorate at USGS grid sites indicated that anthropogenic sources could have contributed to low levels of perchlorate in the groundwater of the Bear Valley study area.

    In the Lake Arrowhead Watershed study area, high and moderate RCs of radon-222 and uranium can be attributed to older groundwater from the granitic fractured-rock primary aquifer system. Low RCs of THMs were detected at USGS grid sites that were wells and part of small water systems. The similarities between the observed and predicted detection frequencies of perchlorate in samples from USGS grid sites indicated that the source and distribution of perchlorate were most likely attributable to precipitation (rain and snow), with minimal, if any, contribution from anthropogenic sources.


              Reducing risk where tectonic plates collide—U.S. Geological Survey subduction zone science plan   

    The U.S. Geological Survey (USGS) serves the Nation by providing reliable scientific information and tools to build resilience in communities exposed to subduction zone earthquakes, tsunamis, landslides, and volcanic eruptions. Improving the application of USGS science to successfully reduce risk from these events relies on whole community efforts, with continuing partnerships among scientists and stakeholders, including researchers from universities, other government labs and private industry, land-use planners, engineers, policy-makers, emergency managers and responders, business owners, insurance providers, the media, and the general public.

    Motivated by recent technological advances and increased awareness of our growing vulnerability to subduction-zone hazards, the USGS is uniquely positioned to take a major step forward in the science it conducts and products it provides, building on its tradition of using long-term monitoring and research to develop effective products for hazard mitigation. This science plan provides a blueprint both for prioritizing USGS science activities and for delineating USGS interests and potential participation in subduction zone science supported by its partners.

    The activities in this plan address many USGS stakeholder needs:

    • High-fidelity tools and user-tailored information that facilitate increasingly more targeted, neighborhood-scale decisions to mitigate risks more cost-effectively and ensure post-event operability. Such tools may include maps, tables, and simulated earthquake ground-motion records conveying shaking intensity and frequency. These facilitate the prioritization of retrofitting of vulnerable infrastructure;
    • Information to guide local land-use and response planning to minimize development in likely hazardous zones (for example, databases, maps, and scenario documents to guide evacuation route planning in communities near volcanoes, along coastlines vulnerable to tsunamis, and built on landslide-prone terrain);
    • New tools to assess the potential for cascading hazards, such as landslides, tsunamis, coastal changes, and flooding caused by earthquakes or volcanic eruptions;
    • Geospatial models of permanent, widespread land- and sea-level changes that may occur in the immediate aftermath of great (M ≥8.0) subduction zone earthquakes;
    • Strong partnerships between scientists and public safety providers for effective decision making during periods of elevated hazard and risk;
    • Accurate forecasts of far-reaching hazards (for example, ash clouds, tsunamis) to avert catastrophes and unnecessary disruptions in air and sea transportation;
    • Aftershock forecasts to guide decisions about when and where to re-enter, repair, or rebuild buildings and infrastructure, for all types of subduction zone earthquakes.

              The finite, kinematic rupture properties of great-sized earthquakes since 1990   

    Here, I present a database of >160 finite fault models for all earthquakes of M 7.5 and above since 1990, created using a consistent modeling approach. The use of a common approach facilitates easier comparisons between models, and reduces uncertainties that arise when comparing models generated by different authors, data sets and modeling techniques.

    I use this database to verify published scaling relationships, and for the first time show a clear and intriguing relationship between maximum potency (the product of slip and area) and average potency for a given earthquake. This relationship implies that earthquakes do not reach the potential size given by the tectonic load of a fault (sometimes called “moment deficit,” calculated via a plate rate over time since the last earthquake, multiplied by geodetic fault coupling). Instead, average potency (or slip) scales with but is less than maximum potency (dictated by tectonic loading). Importantly, this relationship facilitates a more accurate assessment of maximum earthquake size for a given fault segment, and thus has implications for long-term hazard assessments. The relationship also suggests earthquake cycles may not completely reset after a large earthquake, and thus repeat rates of such events may appear shorter than is expected from tectonic loading. This in turn may help explain the phenomenon of “earthquake super-cycles” observed in some global subduction zones.


              Elevation Difference and Bouguer Anomaly Analysis Tool (EDBAAT) User's Guide   

    This report describes a software tool that imports gravity anomaly point data from the Gravity Database of the United States (GDUS) of the National Geospatial-Intelligence Agency and University of Texas at El Paso along with elevation data from The National Map (TNM) of the U.S. Geological Survey that lie within a user-specified geographic area of interest. Further, the tool integrates these two sets of data spatially and analyzes the consistency of the elevation of each gravity station from the GDUS with TNM elevation data; it also evaluates the consistency of gravity anomaly data within the GDUS data repository. The tool bins the GDUS data based on user-defined criteria of elevation misfit between the GDUS and TNM elevation data. It also provides users with a list of points from the GDUS data, which have Bouguer anomaly values that are considered outliers (two standard deviations or greater) with respect to other nearby GDUS anomaly data. “Nearby” can be defined by the user at time of execution. These outputs should allow users to quickly and efficiently choose which points from the GDUS would be most useful in reconnaissance studies or in augmenting and extending the range of individual gravity studies.
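    The outlier screen described above can be pictured with a short sketch (hypothetical values and a simplified notion of "nearby"; this is not the tool's actual code):

        using System;
        using System.Linq;

        class OutlierScreen
        {
            // Flag anomaly values two or more standard deviations from the
            // mean of their neighbors, mirroring the report's criterion.
            static bool IsOutlier(double value, double[] nearbyValues)
            {
                double mean = nearbyValues.Average();
                double sd = Math.Sqrt(nearbyValues.Average(v => (v - mean) * (v - mean)));
                return Math.Abs(value - mean) >= 2 * sd;
            }

            static void Main()
            {
                double[] nearby = { -31.2, -30.8, -31.5, -30.9 };  // hypothetical mGal values
                Console.WriteLine(IsOutlier(-25.0, nearby));       // True: far from its neighbors
                Console.WriteLine(IsOutlier(-31.0, nearby));       // False
            }
        }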


              Groundwater resources of the Devils Postpile National Monument—Current conditions and future vulnerabilities   

    This study presents an extensive database on groundwater conditions in and around Devils Postpile National Monument. The database contains chemical analyses of springs and the monument water-supply well, including major-ion chemistry, trace element chemistry, and the first information on a list of organic compounds known as emerging contaminants. Diurnal, seasonal, and annual variations in groundwater discharge and chemistry are evaluated from data collected at five main monitoring sites, where streams carry the aggregate flow from entire groups of springs. These springs drain the Mammoth Mountain area and, during the fall months, contribute a significant fraction of the San Joaquin River flow within the monument. The period of this study, from fall 2012 to fall 2015, includes some of the driest years on record, though the seasonal variability observed in 2013 might have been near normal. The spring-fed streams generally flowed at rates well below those observed during a sequence of wet years in the late 1990s. However, persistence of flow and reasonably stable water chemistry through the recent dry years are indicative of a sizeable groundwater system that should provide a reliable resource during similar droughts in the future. Only a few emerging contaminants were detected at trace levels below 1 microgram per liter (μg/L), suggesting that local human visitation is not degrading groundwater quality. No indication of salt from the ski area on the north side of Mammoth Mountain could be found in any of the groundwaters. Chemical data instead show that natural mineral water, such as that discharged from local soda springs, is the main source of anomalous chloride in the monument supply well and in the San Joaquin River. The results of the study are used to develop a set of recommendations for future monitoring to enable detection of deleterious impacts to groundwater quality and quantity.


              Expanding the North American Breeding Bird Survey analysis to include additional species and regions   

    The North American Breeding Bird Survey (BBS) contains data for >700 bird species, but analyses often focus on a core group of ∼420 species. We analyzed data for 122 species of North American birds for which data exist in the North American Breeding Bird Survey (BBS) database but are not routinely analyzed on the BBS Summary and Analysis Website. Many of these species occur in the northern part of the continent, on routes that fall outside the core survey area presently analyzed in the United States and southern Canada. Other species not historically analyzed occur in the core survey area with very limited data but have large portions of their ranges in Mexico and south. A third group of species not historically analyzed included species thought to be poorly surveyed by the BBS, such as rare, coastal, or nocturnal species. For 56 species found primarily in regions north of the core survey area, we expanded the scope of the analysis, using data from 1993 to 2014 during which ≥3 survey routes had been sampled in 6 northern strata (Bird Conservation Regions in Alaska, Yukon, and Newfoundland and Labrador) and fitting log-linear hierarchical models for an augmented BBS survey area that included both the new northern strata and the core survey area. We also applied this model to 168 species historically analyzed in the BBS that had data from these additional northern strata. For both groups of species we calculated survey-wide trends for both the core and augmented survey areas from 1993 to 2014; for species that did not occur in the newly defined strata, we computed trends from 1966 to 2014. We evaluated trend estimates in terms of established credibility criteria for BBS results, screening for imprecise trends, small samples, and low relative abundance. Inclusion of data from the northern strata permitted estimation of trend for 56 species not historically analyzed, but only 4 of these were reasonably monitored and an additional 13 were questionably monitored; 39 of these species were likely poorly monitored because of small numbers of samples or very imprecisely estimated trends. Only 4 of 66 “new” species found in the core survey area were reasonably monitored by the BBS; 20 were questionably monitored; and 42 were likely poorly monitored by the BBS because of deficiencies in precision, abundance, or sample size. The hierarchical analyses we present provide a means for reasonable inclusion of the additional species and strata in a common analysis with data from the core area, a critical step in the evolution of the BBS as a continent-scale survey. We recommend that results be presented both 1) from 1993 to the present using the expanded survey area, and 2) from 1966 to the present for the core survey area. Although most of the “new” species we analyzed were poorly monitored by the BBS during 1993–2014, continued expansion of the BBS will improve the quality of information in future analyses for these species and for the many other species presently monitored by the BBS.


              Five hydrologic and landscape databases for selected National Wildlife Refuges in the Southeastern United States   

    This report serves as metadata and a user guide for five out of six hydrologic and landscape databases developed by the U.S. Geological Survey, in cooperation with the U.S. Fish and Wildlife Service, to describe data-collection, data-reduction, and data-analysis methods used to construct the databases and provides statistical and graphical descriptions of the databases. Six hydrologic and landscape databases were developed: (1) the Cache River and White River National Wildlife Refuges (NWRs) and contributing watersheds in Arkansas, Missouri, and Oklahoma, (2) the Cahaba River NWR and contributing watersheds in Alabama, (3) the Caloosahatchee and J.N. “Ding” Darling NWRs and contributing watersheds in Florida, (4) the Clarks River NWR and contributing watersheds in Kentucky, Tennessee, and Mississippi, (5) the Lower Suwannee NWR and contributing watersheds in Georgia and Florida, and (6) the Okefenokee NWR and contributing watersheds in Georgia and Florida. Each database is composed of a set of ASCII files, Microsoft Access files, and Microsoft Excel files. The databases were developed as an assessment and evaluation tool for use in examining NWR-specific hydrologic patterns and trends as related to water availability and water quality for NWR ecosystems, habitats, and target species. The databases include hydrologic time-series data, summary statistics on landscape and hydrologic time-series data, and hydroecological metrics that can be used to assess NWR hydrologic conditions and the availability of aquatic and riparian habitat. Landscape data that describe the NWR physiographic setting and the locations of hydrologic data-collection stations were compiled and mapped. Categories of landscape data include land cover, soil hydrologic characteristics, physiographic features, geographic and hydrographic boundaries, hydrographic features, and regional runoff estimates. The geographic extent of each database covers an area within which human activities, climatic variation, and hydrologic processes can potentially affect the hydrologic regime of the NWRs and adjacent areas.

    The hydrologic and landscape database for the Cache and White River NWRs and contributing watersheds in Arkansas, Missouri, and Oklahoma has been described and documented in detail (Buell and others, 2012). This report serves as a companion to the Buell and others (2012) report to describe and document the five subsequent hydrologic and landscape databases that were developed: Chapter A—the Cahaba River NWR and contributing watersheds in Alabama, Chapter B—the Caloosahatchee and J.N. “Ding” Darling NWRs and contributing watersheds in Florida, Chapter C—the Clarks River NWR and contributing watersheds in Kentucky, Tennessee, and Mississippi, Chapter D—the Lower Suwannee NWR and contributing watersheds in Georgia and Florida, and Chapter E—the Okefenokee NWR and contributing watersheds in Georgia and Florida.


              Groundwater quality in the Western San Joaquin Valley study unit, 2010: California GAMA Priority Basin Project   

    Water quality in groundwater resources used for public drinking-water supply in the Western San Joaquin Valley (WSJV) was investigated by the USGS in cooperation with the California State Water Resources Control Board (SWRCB) as part of its Groundwater Ambient Monitoring and Assessment (GAMA) Program Priority Basin Project. The WSJV includes two study areas: the Delta–Mendota and Westside subbasins of the San Joaquin Valley groundwater basin. Study objectives for the WSJV study unit included two assessment types: (1) a status assessment yielding quantitative estimates of the current (2010) status of groundwater quality in the groundwater resources used for public drinking water, and (2) an evaluation of natural and anthropogenic factors that could be affecting the groundwater quality. The assessments characterized the quality of untreated groundwater, not the quality of treated drinking water delivered to consumers by water distributors.

    The status assessment was based on data collected from 43 wells sampled by the U.S. Geological Survey for the GAMA Priority Basin Project (USGS-GAMA) in 2010 and data compiled in the SWRCB Division of Drinking Water (SWRCB-DDW) database for 74 additional public-supply wells sampled for regulatory compliance purposes between 2007 and 2010. To provide context, concentrations of constituents measured in groundwater were compared to U.S. Environmental Protection Agency (EPA) and SWRCB-DDW regulatory and non-regulatory benchmarks for drinking-water quality. The status assessment used a spatially weighted, grid-based method to estimate the proportion of the groundwater resources used for public drinking water that has concentrations for particular constituents or class of constituents approaching or above benchmark concentrations. This method provides statistically unbiased results at the study-area scale within the WSJV study unit, and permits comparison of the two study areas to other areas assessed by the GAMA Priority Basin Project statewide.

    Groundwater resources used for public drinking water in the WSJV study unit are among the most saline and most affected by high concentrations of inorganic constituents of all groundwater resources used for public drinking water that have been assessed by the GAMA Priority Basin Project statewide. Among the 82 GAMA Priority Basin Project study areas statewide, the Delta–Mendota study area ranked above the 90th percentile for aquifer-scale proportions of groundwater resources having concentrations of total dissolved solids (TDS), sulfate, chloride, manganese, boron, chromium(VI), selenium, and strontium above benchmarks, and the Westside study area ranked above the 90th percentile for TDS, sulfate, manganese, and boron.

    In the WSJV study unit as a whole, one or more inorganic constituents with regulatory or non-regulatory, health-based benchmarks were present at concentrations above benchmarks in about 53 percent of the groundwater resources used for public drinking water, and one or more organic constituents with regulatory health-based benchmarks were detected at concentrations above benchmarks in about 3 percent of the resource. Individual constituents present at concentrations greater than health-based benchmarks in greater than 2 percent of groundwater resources used for public drinking water included: boron (51 percent, SWRCB-DDW notification level), chromium(VI) (25 percent, SWRCB-DDW maximum contaminant level (MCL)), arsenic (10 percent, EPA MCL), strontium (5.1 percent, EPA Lifetime health advisory level (HAL)), nitrate (3.9 percent, EPA MCL), molybdenum (3.8 percent, EPA HAL), selenium (2.6 percent, EPA MCL), and benzene (2.6 percent, SWRCB-DDW MCL). In addition, 50 percent of the resource had TDS concentrations greater than non-regulatory, aesthetic-based SWRCB-DDW upper secondary maximum contaminant level (SMCL), and 44 percent had manganese concentrations greater than the SWRCB-DDW SMCL.

    Natural and anthropogenic factors that could affect the groundwater quality were evaluated by using results from statistical testing of associations between constituent concentrations and values of potential explanatory factors, inferences from geochemical and age-dating tracer results, and by considering the water-quality results in the context of the hydrogeologic setting of the WSJV study unit.

    Natural factors, particularly the lithologies of the source areas for groundwater recharge and of the aquifers, were the dominant factors affecting groundwater quality in most of the WSJV study unit. However, where groundwater resources used for public supply included groundwater recharged in the modern era, mobilization of constituents by recharge of water used for irrigation also affected groundwater quality. Public-supply wells in the Westside study area had a median depth of 305 m and primarily tapped groundwater recharged hundreds to thousands of years ago, whereas public-supply wells in the Delta–Mendota study area had a median depth of 85 m and primarily tapped either groundwater recharged within the last 60 years or groundwater consisting of mixtures of this modern recharge and older recharge.

    Public-supply wells in the WSJV study unit are screened in the Tulare Formation and zones above and below the Corcoran Clay Member are used. The Tulare Formation primarily consists of alluvial sediments derived from the Coast Ranges to the west, except along the valley trough at the eastern margin of the WSJV study unit where the Tulare Formation consists of fluvial sands derived from the Sierra Nevada to the east. Groundwater from wells screened in the Sierra Nevada sands had manganese-reducing or manganese- and iron-reducing oxidation-reduction (redox) conditions. These redox conditions commonly were associated with elevated arsenic or molybdenum concentrations, and the dominance of arsenic(III) in the dissolved arsenic supports reductive dissolution of iron and manganese oxyhydroxides as the mechanism. In addition, groundwater from many wells screened in Sierra Nevada sands contained low concentrations of nitrite or ammonium, indicating reduction of nitrate by denitrification or dissimilatory processes, respectively.

    Geology of the Coast Ranges westward of the study unit strongly affects groundwater quality in the WSJV. Elevated concentrations of TDS, sulfate, boron, selenium and strontium in groundwater were primarily associated with aquifer sediments and recharge derived from areas of the Coast Ranges dominated by Cretaceous-to-Miocene age, organic-rich, reduced marine shales, known as the source of selenium in WSJV soils, surface water, and groundwater. Low sulfur-isotopic values (δ34S) of dissolved sulfate indicate that the sulfate was largely derived from oxidation of biogenic pyrite from the shales, and correlations with trace element concentrations, geologic setting, and groundwater geochemical modeling indicated that distributions of sulfate, strontium, and selenium in groundwater were controlled by dissolution of secondary sulfate minerals in soils and sediments.

    Elevated concentrations of chromium(VI) were primarily associated with aquifer sediments and recharge derived from areas of the Coast Ranges dominated by the Franciscan Complex and ultramafic rocks. The Franciscan Complex also has boron-rich, sodium-chloride dominated hydrothermal fluids that contribute to elevated concentrations of boron and TDS.

    Groundwater from wells screened in Coast Ranges alluvium was primarily oxic and relatively alkaline (median pH value of 7.55) in the Delta–Mendota study area, and primarily nitrate-reducing or suboxic and alkaline (median pH value of 8.4) in the Westside study area. Many groundwater samples from those wells have elevated concentrations of arsenic(V), molybdenum, selenium, or chromium(VI), consistent with desorption of metal oxyanions from mineral surfaces under those geochemical conditions.

    High concentrations of benzene were associated with deep wells located in the vicinity of petroleum deposits at the southern end of the Westside study area. Groundwater from these wells had premodern age and anoxic geochemical conditions, and the ratios among concentrations of hydrocarbon constituents were different from ratios found in fuels and combustion products, which is consistent with a geogenic source for the benzene rather than contamination from anthropogenic sources.

    Water stable-isotope compositions, groundwater recharge temperatures, and groundwater ages were used to infer four types of groundwater: (1) groundwater derived from natural recharge of water from major rivers draining the Sierra Nevada; (2) groundwater primarily derived from natural recharge of water from Coast Ranges runoff; (3) groundwater derived from recharge of pumped groundwater applied to the land surface for irrigation; and (4) groundwater derived from recharge during a period of much cooler paleoclimate. Water previously used for irrigation was found both above and below the Corcoran Clay, supporting earlier inferences that this clay member is no longer a robust confining unit.

    Recharge of water used for irrigation has direct and indirect effects on groundwater quality. Elevated nitrate concentrations and detections of herbicides and fumigants in the Delta–Mendota study area generally were associated with greater agricultural land use near the well and with water recharged during the last 60 years. However, the extent of the groundwater resource affected by agricultural sources of nitrate was limited by groundwater redox conditions sufficient to reduce nitrate. The detection frequency of perchlorate in Delta–Mendota groundwater was greater than expected for natural conditions. Perchlorate, nitrate, selenium, and strontium concentrations were correlated with one another and were greater in groundwater inferred to be recharge of previously pumped groundwater used for irrigation. The source of the perchlorate, selenium, and strontium appears to be salts deposited in the soils and sediments of the arid WSJV that are dissolved and flushed into groundwater by the increased amount of recharge caused by irrigation. In the Delta–Mendota study area, the groundwater with elevated concentrations of selenium was found deeper in the aquifer system than it was reported by a previous study 25 years earlier, suggesting that this transient front of groundwater with elevated concentrations of constituents derived from dissolution of soil salts by irrigation recharge is moving down through the aquifer system and is now reaching the depth zone used for public drinking water supply.


              An updated geospatial liquefaction model for global application   

    We present an updated geospatial approach to estimation of earthquake-induced liquefaction from globally available geospatial proxies. Our previous iteration of the geospatial liquefaction model was based on mapped liquefaction surface effects from four earthquakes in Christchurch, New Zealand, and Kobe, Japan, paired with geospatial explanatory variables including slope-derived VS30, compound topographic index, and magnitude-adjusted peak ground acceleration from ShakeMap. The updated geospatial liquefaction model presented herein improves the performance and the generality of the model. The updates include (1) expanding the liquefaction database to 27 earthquake events across 6 countries, (2) addressing the sampling of nonliquefaction for incomplete liquefaction inventories, (3) testing interaction effects between explanatory variables, and (4) overall improving model performance. While we test 14 geospatial proxies for soil density and soil saturation, the most promising geospatial parameters are slope-derived VS30, modeled water table depth, distance to coast, distance to river, distance to closest water body, and precipitation. We found that peak ground velocity (PGV) performs better than peak ground acceleration (PGA) as the shaking intensity parameter. We present two models which offer improved performance over prior models. We evaluate model performance using the area under the Receiver Operating Characteristic (ROC) curve (AUC) and the Brier score. The best-performing model in a coastal setting uses distance to coast but is problematic for regions away from the coast. The second-best model, using PGV, VS30, water table depth, distance to closest water body, and precipitation, performs better in noncoastal regions and thus is the model we recommend for global implementation.
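
    For reference (a standard definition, not spelled out in the abstract): the Brier score is the mean squared difference between predicted probability and observed outcome, BS = (1/N) Σ (p_i − o_i)², where p_i is the predicted liquefaction probability and o_i ∈ {0, 1} the observed occurrence; lower is better, complementing the rank-based view given by AUC.
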
              Thermoelectric power plant water withdrawals and associated attributes for three Federal datasets in the United States, 2010   

    This dataset combines three Federal datasets of thermoelectric, non-industrial, power plant water withdrawals and associated plant information for the United States in 2010, excluding Puerto Rico and the U.S. Virgin Islands. Historically, thermoelectric water withdrawal has been estimated by the Department of Energy's Energy Information Administration (EIA) through surveys of plant operator-reported data, and the Department of Interior's U.S. Geological Survey's (USGS) 5-year water-use reports including data compiled from state water agencies, plant operators, and the EIA. Recently, the USGS developed models for estimating withdrawal at thermoelectric plants to provide independent estimates from plant operator-reported data. The three Federal datasets include plant-level data for 1,349 plants derived from EIA's 2010 Form EIA-860 and Form EIA-923 databases, USGS 2010 compilation-reported data (hereafter referred to as the USGS-compilation dataset), and USGS 2010 model-estimated data (hereafter referred to as the USGS-model dataset). The plant-level USGS-compilation data provided in this dataset were disaggregated from county-level data published in USGS Circular 1405 (Maupin and others, 2014). The USGS-model data and the EIA data presented in this dataset were previously published in USGS Scientific Investigations Report 2014-5184 (Diehl and Harris, 2014). The year 2010 was chosen because it is the most recent year the USGS 5-year compilation report was published and the only year for which the USGS model estimates have been calculated.

    Diehl, T.H., and Harris, M.A., 2014, Withdrawal and consumption of water by thermoelectric power plants in the United States, 2010: U.S. Geological Survey Scientific Investigations Report 2014–5184, 28 p., http://dx.doi.org/10.3133/sir20145184.

    Maupin, M.A., Kenny, J.F., Hutson, S.S., Lovelace, J.K., Barber, N.L., and Linsey, K.S., 2014, Estimated use of water in the United States in 2010: U.S. Geological Survey Circular 1405, 56 p., http://dx.doi.org/10.3133/cir1405.


              Ecological community datasets used to evaluate the presence of trends in ecological communities in selected rivers and streams across the United States, 1992-2012   

    In 1991, the U.S. Geological Survey (USGS) began a study of more than 50 major river basins across the Nation as part of the National Water-Quality Assessment (NAWQA) project of the National Water-Quality Program. One of the major goals of the NAWQA project is to determine how water-quality and ecological conditions change over time. To support that goal, long-term consistent and comparable ecological monitoring has been conducted on streams and rivers throughout the Nation. Fish, invertebrate, and algae data collected as part of the NAWQA program were retrieved from the USGS Aquatic Bioassessment database for use in trend analysis. Ultimately, these data will provide insight into how natural features and human activities have contributed to changes in ecological condition over time in the Nation’s streams and rivers. This USGS data release contains all of the input and output files necessary to reproduce the results of the ecological trend analysis described in the associated U.S. Geological Survey Scientific Investigations Report (http://dx.doi.org/10.3133/sir20175006). Data preparation for input to the model is also fully described in the above mentioned report.


              Empirical models for predicting volumes of sediment deposited by debris flows and sediment-laden floods in the transverse ranges of southern California   

    Debris flows and sediment-laden floods in the Transverse Ranges of southern California pose severe hazards to nearby communities and infrastructure. Frequent wildfires denude hillslopes and increase the likelihood of these hazardous events. Debris-retention basins protect communities and infrastructure from the impacts of debris flows and sediment-laden floods and also provide critical data for volumes of sediment deposited at watershed outlets. In this study, we supplement existing data for the volumes of sediment deposited at watershed outlets with newly acquired data to develop new empirical models for predicting volumes of sediment produced by watersheds located in the Transverse Ranges of southern California. The sediment volume data represent a broad sample of conditions found in Ventura, Los Angeles and San Bernardino Counties, California. The measured volumes of sediment, watershed morphology, distributions of burn severity within each watershed, the time since the most recent fire, triggering storm rainfall conditions, and engineering soil properties were analyzed using multiple linear regressions to develop two models. A “long-term model” was developed for predicting volumes of sediment deposited by both debris flows and floods at various times since the most recent fire from a database of volumes of sediment deposited by a combination of debris flows and sediment-laden floods with no time limit since the most recent fire (n = 344). A subset of this database was used to develop an “emergency assessment model” for predicting volumes of sediment deposited by debris flows within two years of a fire (n = 92). Prior to developing the models, 32 volumes of sediment and related parameters for watershed morphology, burn severity, and rainfall conditions were retained to independently validate the long-term model. Ten of these volumes of sediment were deposited by debris flows within two years of a fire and were used to validate the emergency assessment model. The models were validated by comparing predicted and measured volumes of sediment. These validations were also performed for previously developed models and indicate that the models developed here predict volumes of sediment for burned watersheds better than previously developed models.
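
    Schematically, each model takes the standard multiple-linear-regression form ln V = β0 + β1·x1 + … + βk·xk, where V is the sediment volume and the predictors x1…xk are drawn from the morphology, burn-severity, rainfall, and soil variables listed above; the log-linear form is an assumption here (common for sediment-volume models), and the fitted coefficients and exact predictor sets are given in the report itself.
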
              Database of Trump Administration Officials' Personal Finances Grows   

    It allows anyone to easily understand the wealth, assets and business interests of many of the people working for President Trump. These include Senate-confirmed appointees, White House aides and members of so-called “beachhead teams.”


              Systems Analyst (Teradata Database Developer) - McCain Foods (Canada) - Florenceville, NB   
    McCain Foods is seeking a Systems Analyst, specialized in Teradata database development, to contribute to the success of our Enterprise Data Warehouse (EDW)...
    From McCain - Tue, 04 Apr 2017 12:46:27 GMT - View all Florenceville, NB jobs
              Recreating deleted webform with same name results in error   

    Problem/Motivation

    After updating to 8.x-5.0-beta14 we had a problem with our automated tests. A test creates a webform and then deletes it. On the second run, the test created a webform with the same name as in the first run; however, this time the message that the webform was successfully created did not appear. The webform turned out to be created, though. This is manually reproducible: create a webform called 'henri', delete it, and the second attempt to create it results in an error in the logs.

    Drupal\Core\Database\IntegrityConstraintViolationException: SQLSTATE[23000]: Integrity constraint violation: 1062 Duplicate entry 'henri' for key 'PRIMARY': INSERT INTO {webform} (webform_id, next_serial) VALUES (:db_insert_placeholder_0, :db_insert_placeholder_1); Array ( [:db_insert_placeholder_0] => henri [:db_insert_placeholder_1] => 1 ) in Drupal\Core\Database\Connection->handleQueryException() (regel 682 van /var/www/natarch/core/lib/Drupal/Core/Database/Connection.php).

    It turns out that deleting a webform indeed does not remove its entry from the new 'webform' table, while creating a webform with the same ID tries to insert the entry again even though it already exists.

    SQL confirms this (this was after deleting the webform)

    select * from webform where webform_id = 'henri';
    +------------+-------------+
    | webform_id | next_serial |
    +------------+-------------+
    | henri      |           1 |
    +------------+-------------+

    Proposed resolution

    Either delete the entry from the webform table when a form is deleted, or do not recreate it when the entry already exists.
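
    A minimal sketch of both options in MySQL syntax (matching the backend in the log above; illustrative only, not a patch):

        -- Option 1: clean up the tracking row when the webform is deleted.
        DELETE FROM webform WHERE webform_id = 'henri';

        -- Option 2: make the insert idempotent, so re-creating a webform
        -- with a previously used ID keeps the existing row (and its serial)
        -- instead of violating the primary key.
        INSERT INTO webform (webform_id, next_serial)
        VALUES ('henri', 1)
        ON DUPLICATE KEY UPDATE next_serial = next_serial;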


              Senior Operations DBA   
    CA-Salinas, RESPONSIBILITIES: Kforce has an immediate need for a Senior Database Administrator for one of our premier clients located in the Salinas, California (CA) area. The ideal candidate would have strong Operations DBA experience working with MS SQL 2012 or later versions. REQUIREMENTS: Prefer experience with MS SQL technologies including SSAS, SSIS, SSRS; prefer previous experience with clustered database e
              Systems Analyst (Teradata Database Developer) - McCain Foods (Canada) - Florenceville, NB   
    Database development activities will include eliciting data requirements, source data analysis, design of ELT solutions, load and query performance tuning, data...
    From McCain - Tue, 04 Apr 2017 12:46:27 GMT - View all Florenceville, NB jobs
              How to Hack an Election in 7 Minutes - POLITICO Magazine   
    When Princeton professor Andrew Appel decided to hack into a voting machine, he didn’t try to mimic the Russian attackers who hacked into the Democratic National Committee's database last month. He didn’t write malicious code, or linger near a polling place where the  …
              Weis Ad Scan 07/06/17-07/12/17   
    Here is the NEW Weis Ad Scan for the sales coming up, starting 07/06/17 and running thru 07/12/17. Search for matching coupons for these advertised deals using our coupon database HERE. Take a look at the FULL ad scan and be sure to let us know what your favorite deals are in the comments […].

    Brought to you by: Frugal Focus
              Marketing Manager - DAP Products Inc. - Ontario   
    This includes development and management of a database warehouse of consumer focused and retail sales fact based data....
    From DAP Products Inc. - Wed, 07 Jun 2017 00:02:06 GMT - View all Ontario jobs
              Sourcing Recruiter – Technology   
    Booking.com BV (the company behind Booking.com™, the market leading online hotel reservation service in the world) and/or its various support companies throughout the world are looking for people to support the business in the fast-growing hotel markets. Booking.com is looking for a Sourcing Technology Recruiter to join our IT recruitment team in Amsterdam. The ideal candidate should be ready to hit the ground running in our fast-paced technical environment, with strong interpersonal skills that allow you to develop lasting relationships with both candidates and hiring managers. As a Sourcing Technology Recruiter you will play a significant role in attracting the best talents around the world and lead the recruiting process for technical and non-technical roles within the IT Department. This is a great opportunity for a driven, energetic, and highly motivated individual that is ready to join a successful global recruitment team. You should be a self-starter, hardworking, flexible, goal-oriented, and energized by constantly meeting new people. You must be able to multi-task, have strong candidate management skills, and possess the ability to prioritize responsibilities. This role will challenge your creativity, analytical skills and persuasiveness. Whether you use brilliant Boolean logic or have fantastic phone sourcing skills, above all you’re motivated by reaching your goal: finding and attracting great people! Once you find these key players, you will engage with them to learn about their aspirations and interests.

    B.Responsible

    • Developing and maintaining strong business relationships with hiring managers and senior management within the IT Department
    • Managing full cycle recruitment for roles within the IT Department while ensuring a smooth and positive candidate experience
    • Uncover talent utilizing Boolean search, database scrubbing, networking, relationship building, event lists, social media, sourcing tools, predictive sourcing tactics and industry research
    • Develop innovative sourcing strategies and manage recruiting campaigns and job postings
    • Build talent maps of competitive companies through cold calling into companies and/or internet research
    • Phone screen candidates for availability, interest level, hard skills, salary range, and basic qualifications
    • Able to develop and track metrics of all sourcing activities
    • Negotiating job offers by partnering with hiring manager, senior management and other stakeholders as necessary
              Mapping out OCI 7.3 DWH GUIs: Part 1: Insight Plan DWH Portal   
    In this post we explore the OnCommand Insight 7.3 Insight Plan Data Warehouse Portal main page.

    An installation of OnCommand Insight 7.3 Data Warehouse Server with default settings configures these ports:

    80: Insight Plan DWH Portal Port (HTTP)
    443: Insight Plan DWH Portal Port (HTTPS)
    3306: Internal Database Port (SQL)
    9300: Reporting Engine Port

    Insight Plan Data Warehouse Portal

    Access via a web browser to:

    https://FQDN_of_OCI_DWH_SERVER/dwh
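
    Since the internal database port (3306) is plain SQL (MySQL), the underlying data marts can also be queried directly with any SQL client. A hypothetical example (the mart and table names are assumptions, not taken from this post — check them against your own installation):

        -- Hypothetical: list arrays from the inventory mart over the
        -- internal database port; schema and table names may differ by release.
        SELECT name, vendor
          FROM dwh_inventory.storage;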

    Web UI Main Buttons

    Image: OnCommand Insight 7.3 Data Warehouse Web UI Main Buttons

    Main Menu

     Connectors  -->
     List Connectors to OnCommand Insight Servers / Create new / Test / Remove

     Jobs  -->
     List Jobs {ID, Name, Status, Start, End} / Clear History

     Schedule  -->
     Build Schedule (or Build now) / Backup Schedule / Custom Script

     Annotations  -->
     List of Annotations {Annotation, Column Name, Target Object, Published} / Edit

     Email Notification  -->
     Job Status Notification Configuration {SMTP configuration, Sender, Recipient ...} / Test

     System Information  -->
     System (Edit Site Name) / Licenses / Upgrade History


     Build from history  -->
     ‘Build From History’ history {Target time, Start running, Status} / Configure / Run

     Reset DWH  -->
     Reset DWH Database / Reset Inventory Only / Reset Reporting Content

     Backup/Restore  -->
     Backup Database and Reports / Restore Database and(or) Reports

     Troubleshooting  -->
     OnCommand Insight ASUP / Annotation Consolidation / Sanity Tests

     User Management  -->
     LDAP Configuration / Add New User / Configure DWH User

    Top Right Corner

    Launch Reporting Portal

    Help
    - Documentation
    - EULA

    Logged in as XXX
    - Logout


              New Semantic Publishing Benchmark Record   

    There is a new SPB (Semantic Publishing Benchmark) 256 Mtriple record with Virtuoso.

    As before, the result has been measured with the feature/analytics branch of the v7fasttrack open source distribution, and it will soon be available as a preconfigured Amazon EC2 image. The updated benchmarks AMI with this version of the software will be out there within the next week, to be announced on this blog.

    On the Cost of RDF Query Optimization

    RDF query optimization is harder than the relational equivalent; first, because there are more joins, hence an NP-complete explosion of plan search space, and second, because cardinality estimation is harder and usually less reliable. The work on characteristic sets, pioneered by Thomas Neumann in RDF-3X, uses regularities in structure for treating properties usually occurring in the same subject as columns of a table. The same idea is applied for tuning physical representation in the joint Virtuoso / MonetDB work published at WWW 2015.

    The Virtuoso results discussed here, however, are all based on a single RDF quad table with Virtuoso's default index configuration.

    Introducing query plan caching raises the Virtuoso score from 80 qps to 144 qps at the 256 Mtriple scale. The SPB queries are not extremely complex; lookups with many more triple patterns exist in actual workloads, e.g., Open PHACTS. In such applications, query optimization indeed dominates execution times. In SPB, data volumes touched by queries grow near linearly with data scale. At the 256 Mtriple scale, nearly half of CPU cycles are spent deciding a query plan. Below are the CPU cycles for execution and compilation per query type, sorted by descending sum of the times, scaled to milliseconds per execution. These are taken from a one minute sample of running at full throughput.

    Test system is the same used before in the TPC-H series: dual Xeon E5-2630 Sandy Bridge, 2 x 6 cores x 2 threads, 2.3GHz, 192 GB RAM.

    We measure the compile and execute times, with and without using hash join. When considering hash join, the throughput is 80 qps. When not considering hash join, the throughput is 110 qps. With query plan caching, the throughput is 145 qps whether or not hash join is considered. Using hash join is not significant for the workload but considering its use in query optimization leads to significant extra work.

    With hash join

    Compile Execute Total Query
    3156 ms 1181 ms 4337 ms Total
    1327 ms 28 ms 1355 ms query 01
    444 ms 460 ms 904 ms query 08
    466 ms 54 ms 520 ms query 06
    123 ms 268 ms 391 ms query 05
    257 ms 5 ms 262 ms query 11
    191 ms 59 ms 250 ms query 10
    9 ms 179 ms 188 ms query 04
    114 ms 26 ms 140 ms query 07
    46 ms 62 ms 108 ms query 09
    71 ms 25 ms 96 ms query 12
    61 ms 13 ms 74 ms query 03
    47 ms 2 ms 49 ms query 02
           

    Without hash join

    Compile Execute Total Query
    1816 ms 1019 ms 2835 ms Total
    197 ms 466 ms 663 ms query 08
    609 ms 32 ms 641 ms query 01
    188 ms 293 ms 481 ms query 05
    275 ms 61 ms 336 ms query 09
    163 ms 10 ms 173 ms query 03
    128 ms 38 ms 166 ms query 10
    102 ms 5 ms 107 ms query 11
    63 ms 27 ms 90 ms query 12
    24 ms 57 ms 81 ms query 06
    47 ms 1 ms 48 ms query 02
    15 ms 24 ms 39 ms query 07
    5 ms 5 ms 10 ms query 04

    Considering hash join always slows down compilation, and sometimes improves and sometimes worsens execution. Some improvement in cost-model and plan-space traversal-order is possible, but altogether removing compilation via caching is better still. The results are as expected, since a lookup workload such as SPB has little use for hash join by nature.

    The rationale for considering hash join in the first place is that analytical workloads rely heavily on it. A good TPC-H score is simply infeasible without it, as previously discussed on this blog. If RDF is to be a serious contender beyond serving lookups, then hash join is indispensable. The decision to use it, however, depends on accurate cardinality estimates on either side of the join.

    Previous work (e.g., papers from FORTH around MonetDB) advocates doing away with a cost model altogether, since one is hard and unreliable with RDF anyway. The idea is not without its attraction but will lead to missing out on analytics or to relying on query hints for hash join.

    The present Virtuoso thinking is that going to rule-based optimization is not the preferred solution, but rather using characteristic sets for reducing triples into wider tables, which also cuts down on plan search space and increases reliability of cost estimation.

    When looking at execution alone, we see that actual database operations are low in the profile, with memory management taking the top 19%. This is due to CONSTRUCT queries allocating small blocks for returning graphs, which is entirely avoidable.


              Vectored Execution in Column/Row Stores   

    This article discusses the relationship between vectored execution and column- and row-wise data representations. Column stores are traditionally considered to be good for big scans but poor at indexed access. This is not necessarily so, though. We take TPC-H Q9 as a starting point, working with different row- and column-wise data representations and index choices. The goal of the article is to provide a primer on the performance implications of different physical designs.

    All the experiments are against the TPC-H 100G dataset hosted in Virtuoso on the test system used before in the TPC-H series: dual Xeon E5-2630, 2x6 cores x 2 threads, 2.3GHz, 192 GB RAM. The Virtuoso version corresponds to the feature/analytics branch in the v7fasttrack github project. All run times are from memory, and queries generally run at full platform, 24 concurrent threads.

    We note that RDF stores and graph databases usually do not have secondary indices with multiple key parts. However, these predominantly use index-based access as opposed to big scans and hash joins. To explore the impact of this, we have decomposed the tables into projections with a single dependent column, which approximates a triple store or a vertically-decomposed graph database like Sparksee.

    So, in these experiments, we store the relevant data four times over, as follows:

    • 100G TPC-H dataset in the column-wise schema as discussed in the TPC-H series, now complemented with indices on l_partkey and on l_partkey, l_suppkey

    • The same in row-wise data representation

    • Column-wise tables with a single dependent column for l_partkey, l_suppkey, l_extendedprice, l_quantity, l_discount, ps_supplycost, s_nationkey, p_name. These all have the original table’s primary key, e.g., l_orderkey, l_linenumber for the l_-prefixed tables

    • The same with row-wise tables

    The column-wise structures are in the DB qualifier, and the row-wise are in the R qualifier. There is a summary of space consumption at the end of the article. This is relevant for scalability, since even if row-wise structures can be faster for scattered random access, they will fit less data in RAM, typically 2 to 3x less. Thus, if "faster" rows cause the working set not to fit, "slower" columns will still win.

    As a starting point, we know that the best Q9 is the one in the Virtuoso TPC-H implementation which is described in Part 10 of the TPC-H blog series. This is a scan of lineitem with a selective hash join, followed by ordered index access of orders, then hash joins against the smaller tables. There are special tricks to keep the hash tables small by propagating restrictions from the probe side to the build side.

    The query texts are available here, along with the table declarations and scripts for populating the single-column projections. rs.sql makes the tables and indices, rsload.sql copies the data from the TPC-H tables.
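
    As a rough sketch of what those scripts set up (the authoritative DDL is in rs.sql; the exact declarations below are assumptions that follow the naming used in this article):

        -- Secondary indices on the multicolumn lineitem table.
        CREATE INDEX lpk_pk ON lineitem (l_partkey);
        CREATE INDEX l_pksk ON lineitem (l_partkey, l_suppkey);

        -- One single-column projection: l_partkey keyed on the original
        -- lineitem primary key; the other projections follow the same pattern.
        CREATE TABLE l_partkey_proj (
          l_orderkey   INTEGER NOT NULL,
          l_linenumber INTEGER NOT NULL,
          l_partkey    INTEGER,
          PRIMARY KEY (l_orderkey, l_linenumber)
        );

        INSERT INTO l_partkey_proj
          SELECT l_orderkey, l_linenumber, l_partkey
            FROM lineitem;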

    The business question is to calculate the profit from sale of selected parts grouped by year and country of the supplier. This touches most of the tables, aggregates over 1/17 of all sales, and touches at least every page of the tables concerned, if not every row.

    SELECT
                                                                             n_name  AS  nation, 
                                                     EXTRACT(year FROM o_orderdate)  AS  o_year,
              SUM (l_extendedprice * (1 - l_discount) - ps_supplycost * l_quantity)  AS  sum_profit
        FROM  lineitem, part, partsupp, orders, supplier, nation
       WHERE    s_suppkey = l_suppkey
         AND   ps_suppkey = l_suppkey
         AND   ps_partkey = l_partkey
         AND    p_partkey = l_partkey
         AND   o_orderkey = l_orderkey
         AND  s_nationkey = n_nationkey
         AND  p_name LIKE '%green%'
    GROUP BY  nation, o_year
    ORDER BY  nation, o_year DESC
    

    Query Variants

    The query variants discussed here are:

    1. Hash based, the best plan -- 9h.sql

    2. Index based with multicolumn rows, with lineitem index on l_partkey -- 9i.sql, 9ir.sql

    3. Index based with multicolumn rows, lineitem index on l_partkey, l_suppkey -- 9ip.sql, 9ipr.sql

    4. Index based with one table per dependent column, index on l_partkey -- 9p.sql

    5. Index based with one table per dependent column, with materialized l_partkey, l_suppkey -> l_orderkey, l_linenumber -- 9pp.sql, 9ppr.sql

    These are done against row- and column-wise data representations with 3 different vectorization settings. The dynamic vector size starts at 10,000 values in a vector, and adaptively upgrades this to 1,000,000 if it finds that index access is too sparse. Accessing rows close to each other is more efficient than widely scattered rows in vectored index access, so using a larger vector will likely cause a denser, hence more efficient, access pattern.

    The 10K vector size corresponds to running with a fixed vector size. The Vector 1 setting sets the vector size to 1, effectively running a tuple at a time, which corresponds to a non-vectorized engine.

    We note that lineitem and its single column projections contain 600M rows. So, a vector of 10K values will hit, on the average, every 60,000th row. A vector of 1,000,000 will thus hit every 600th. This is when doing random lookups that are in no specific order, e.g., getting lineitems by a secondary index on l_partkey.

    1 — Hash-based plan

    Vector Dynamic 10k 1
    Column-wise 4.1 s 4.1 s 145   s
    Row-wise 25.6 s 25.9 s 45.4 s

    Dynamic vector size has no effect here, as there is no indexed access that would gain from more locality. The column store is much faster because of less memory access (just scan the l_partkey column, and filter this with a Bloom filter; and then hash table lookup to pick only items with the desired part). The other columns are accessed only for the matching rows. The hash lookup is vectored since there are hundreds of compressed l_partkey values available at a time. The row store does the hash lookup row by row, hence losing cache locality and instruction-level parallelism.

    Without vectorization, we have a situation where the lineitem scan emits one row at a time. Restarting the scan with the column store takes much longer, since 5 buffers have to be located and pinned instead of one for the row store. The row store is thus slowed down less, but it too suffers almost a factor of 2 from interpretation overhead.

    2 — Index-based, lineitem indexed on l_partkey

    Vector Dynamic 10k 1
    Column-wise 30.4 s 62.3 s 321   s
    Row-wise 31.8 s 27.7 s 122   s

    Here the plan scans part, then partsupp, which shares ordering with part; both are ordered on partkey. Then lineitem is fetched by a secondary index on l_partkey. This produces l_orderkey, l_linenumber, which are used to get the l_suppkey. We then check if the l_suppkey matches the ps_suppkey from partsupp, which drops 3/4 of the rows. The next join is on orders, which shares ordering with lineitem; both are ordered on orderkey.

    There is a narrow win for columns with dynamic vector size. When access becomes scattered, rows win by 2.5x, because there is only one page to access instead of 1 + 3 for columns. This is compensated for if the next item is found on the same page, which happens if the access pattern is denser.

    3 — Index-based, lineitem indexed on l_partkey, l_suppkey

    Vector Dynamic 10k 1
    Column-wise 16.9 s 47.2 s 151   s
    Row-wise 22.4 s 20.7 s 89   s

    This is similar to the previous, except that now only lineitems that match ps_partkey, ps_suppkey are accessed, as the secondary index has two columns. Access is more local. Columns thus win more with dynamic vector size.

    4 — Decomposed, index on l_partkey

    Vector Dynamic 10k 1
    Column-wise 35.7 s 170   s 601   s
    Row-wise 44.5 s 56.2 s 130   s

    Now, each of the l_extendedprice, l_discount, l_quantity and l_suppkey is a separate index lookup. The times are slightly higher but the dynamic is the same.

    The non-vectored columns case is hit the hardest.

    5 — Decomposed, index on l_partkey, l_suppkey

    Vector Dynamic 10k 1
    Column-wise 19.6 s 111   s 257   s
    Row-wise 32.0 s 37   s 74.9 s

    Again, we see the same dynamic as with a multicolumn table. Columns win slightly more at long vector sizes because of overall better index performance in the presence of locality.

    Space Utilization

    The following tables list the space consumption in megabytes of allocated pages. Unallocated space in database files is not counted.

    The row-wise table also contains entries for column-wise structures (DB.*) since these have a row-wise sparse index. The size of this is however negligible, under 1% of the column-wise structures.

    Row-Wise
    MB structure
    73515 R.DBA.LINEITEM
    14768 R.DBA.ORDERS
    11728 R.DBA.PARTSUPP
    10161 r_lpk_pk
    10003 r_l_pksk
    9908 R.DBA.l_partkey
    8761 R.DBA.l_extendedprice
    8745 R.DBA.l_discount
    8738 r_l_pk
    8713 R.DBA.l_suppkey
    6267 R.DBA.l_quantity
    2223 R.DBA.CUSTOMER
    2180 R.DBA.o_orderdate
    2041 r_O_CK
    1911 R.DBA.PART
    1281 R.DBA.ps_supplycost
    811 R.DBA.p_name
    127 R.DBA.SUPPLIER
    88 DB.DBA.LINEITEM
    24 DB.DBA.ORDERS
    11 DB.DBA.PARTSUPP
    9 R.DBA.s_nationkey
    5 l_pksk
    4 DB.DBA.l_partkey
    4 lpk_pk
    4 DB.DBA.l_extendedprice
    3 l_pk
    3 DB.DBA.l_suppkey
    2 DB.DBA.CUSTOMER
    2 DB.DBA.l_quantity
    1 DB.DBA.PART
    1 O_CK
    1 DB.DBA.l_discount
      
    Column-Wise

    MB structure
    36482 DB.DBA.LINEITEM
    13087 DB.DBA.ORDERS
    11587 DB.DBA.PARTSUPP
    5181 DB.DBA.l_extendedprice
    4431 l_pksk
    3072 DB.DBA.l_partkey
    2958 lpk_pk
    2918 l_pk
    2835 DB.DBA.l_suppkey
    2067 DB.DBA.CUSTOMER
    1618 DB.DBA.PART
    1156 DB.DBA.l_quantity
    961 DB.DBA.ps_supplycost
    814 O_CK
    798 DB.DBA.l_discount
    724 DB.DBA.p_name
    436 DB.DBA.o_orderdate
    126 DB.DBA.SUPPLIER
    1 DB.DBA.s_nationkey

    In both cases, the large tables are on top, but the column-wise case takes only half the space due to compression.

    We note that the single column projections are smaller column-wise. The l_extendedprice is not very compressible hence column-wise takes much more space than l_quantity; the row-wise difference is less. Since the leading key parts l_orderkey, l_linenumber are ordered and very compressible, the column-wise structures are in all cases noticeably more compact.

    The same applies to the multipart index l_pksk and r_l_pksk (l_partkey, l_suppkey, l_orderkey, l_linenumber) in column- and row-wise representations.

    Note that STRING columns (e.g., l_comment) are not compressed. If they were, the overall space ratio would be even more to the advantage of the column store.

    Conclusions

    Column stores and vectorization inextricably belong together. Column-wise compression yields great gains also for indices, since sorted data is easy to compress. Also for non-sorted data, adaptive use of dictionaries, run lengths, etc., produce great space savings. Columns also win with indexed access if there is locality.

    Row stores have less dependence on locality, but they also will win by a factor of 3 from dropping interpretation overhead and exploiting join locality.

    For point lookups, columns lose by 2+x but considering their better space efficiency, they will still win if space savings prevent going to secondary storage. For bulk random access, like in graph analytics, columns will win because of being able to operate on a large vector of keys to fetch.

    For many workloads, from TPC-H to LDBC social network, multi-part keys are a necessary component of physical design for performance if indexed access predominates. Triple stores and most graph databases do not have such and are therefore at a disadvantage. Self-joining, like in RDF or other vertically decomposed structures, can cost up to a factor of 10-20 over a column-wise multicolumn table. This depends however on the density of access.

    For analytical workloads, where the dominant join pattern is the scan with selective hash join, column stores are unbeatable, as per common wisdom. There are good physical reasons for this and the row store even with well implemented vectorization loses by a factor of 5.

    For decomposed structures, like RDF quads or single column projections of tables, column stores are relatively more advantageous because the key columns are extensively repeated, and these compress better with columns than with rows. In all the RDF workloads we have tried, columns never lose, but there is often a draw between rows and columns for lookup workloads. The longer the query, the more columns win.


              Big Data, Part 1: Virtuoso Meets Hive   

    In this series, we will look at Virtuoso and some of the big data technologies out there. SQL on Hadoop is of interest, as well as NoSQL technologies.

    We begin at the beginning, with Hive, the grand-daddy of SQL on Hadoop.

    The test platform is two Amazon R3.8 AMI instances. We compared Hive with the Virtuoso 100G TPC-H experiment on the same platform, published earlier on this blog. The runs follow a bulk load in both cases, with all data served from memory. The platform has 2x244GB RAM with only 40GB or so of working set.

    The Virtuoso version and settings are as in the Virtuoso Cluster test AMI.

    The Hive version is 0.14 from the Hortonworks HDP 2.2 distribution. The Hive schema and query formulations are the ones from hive-testbench on GitHub. The Hive configuration parameters are as set by Ambari 2.0.1. These are different from the ones in hive-testbench, but the Ambari choices offer higher performance on the platform. We did run statistics with Hive and did not specify any settings not in the hive-testbench. Thus we suppose the query plans were as good as Hive will make them. Platform utilization was even across both machines, and varied between 30% and 100% of the 2 x 32 hardware threads.

    Load time with Hive was 742 seconds against 232 seconds with Virtuoso. In both cases, this was a copy from 32 CSV files into native database format; for Hive, this is ORC (Optimized Row Columnar). In Virtuoso, there is one index, (o_custkey); in Hive, there are no indices.
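
    For readers unfamiliar with the Hive side of such a load, the usual pattern is to stage the delimited text and rewrite it as ORC. A minimal sketch (table, column, and path names are illustrative; the actual scripts are in hive-testbench):

        -- Illustrative only: stage pipe-delimited text, then convert to ORC.
        CREATE TABLE lineitem_text (
          l_orderkey BIGINT,
          l_comment  STRING
        )
        ROW FORMAT DELIMITED FIELDS TERMINATED BY '|';

        LOAD DATA INPATH '/staging/lineitem' INTO TABLE lineitem_text;

        CREATE TABLE lineitem STORED AS ORC
          AS SELECT * FROM lineitem_text;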

    Query Virtuoso Hive Notes
    232     s 742     s Data Load
    Q1 1.098 s 296.636 s
    Q2 0.187 s >3600     s Hive Timeout
    Q3 0.761 s 98.652 s
    Q4 0.205 s 147.867 s
    Q5 0.808 s 114.782 s
    Q6 2.403 s 71.789 s
    Q7 0.59  s 394.201 s
    Q8 0.775 s >3600     s Hive Timeout
    Q9 1.836 s >3600     s Hive Timeout
    Q10 3.165 s 179.646 s
    Q11 1.37  s 43.094 s
    Q12 0.356 s 101.193 s
    Q13 2.233 s 208.476 s
    Q14 0.488 s 89.047 s
    Q15 0.72 s 136.431 s
    Q16 0.814 s 105.652 s
    Q17 0.681 s 255.848 s
    Q18 1.324 s 337.921 s
    Q19 0.417 s >3600     s Hive Timeout
    Q20 0.792 s 193.965 s
    Q21 0.720 s 670.718 s
    Q22 0.155 s 68.462 s

    Hive does relatively best on bulk load. This is understandable since this is a sequential read of many files in parallel with just compression to do.

    Hive's query times are obviously affected by not having a persistent memory image of the data, as this is always streamed from the storage files into other files as MapReduce intermediate results. This seems to be an operator-at-a-time business as opposed to Virtuoso's vectorized streaming.

    The queries that would do partitioned hash joins (e.g., Q9) did not finish under an hour in Hive, so we do not have a good metric of a cross-partition hash join.

    One could argue that one should benchmark Hive only in disk-bound circumstances. We may yet get to this.

    Our next stop will probably be Impala, which ought to do much better than Hive, as it does not have the MapReduce overheads.

    If you are a Hive expert and believe that Hive should have done much better, please let us know how to improve the Hive scores, and we will retry.


          Rethink Big and Europe's Position in Big Data   

    I will here take a break from core database and talk a bit about EU policies for research funding.

    I had lunch with Stefan Manegold of CWI last week, where we talked about where European research should go. Stefan is involved in RETHINK big, a European research project for compiling policy advice regarding big data for EC funding agencies. As part of this, he is interviewing various stakeholders such as end user organizations and developers of technology.

    RETHINK big wants to come up with a research agenda primarily for hardware, anything from faster networks to greener data centers. CWI represents software expertise in the consortium.

    So, we went through a regular questionnaire about how we see the landscape. I will summarize this below, as this is anyway informative.

    Core competence

    My own core competence is in core database functionality, specifically in high performance query processing, scale-out, and managing schema-less data. Most of the Virtuoso installed base is in the RDF space, but most potential applications are in fact outside of this niche.

    User challenges

    The life sciences vertical is the one in which I have the most application insight, from going to Open PHACTS meetings and holding extensive conversations with domain specialists. We have users in many other verticals, from manufacturing to financial services, but there I do not have as much exposure to the actual applications.

    Having said this, the challenges throughout tend to be in diversity of data. Every researcher has their MySQL database or spreadsheet, and there may not even be a top level catalogue of everything. Data formats are diverse. Some people use linked data (most commonly RDF) as a top level metadata format. The application data, such as gene sequences or microarray assays, reside in their native file formats and there is little point in RDF-izing these.

    There are also public data resources that are published in RDF serializations as a vendor-neutral, self-describing format. Having everything as triples, without a priori schema, makes things easier to integrate and in some cases easier to describe and query.

    So, the challenge is in the labor intensive nature of data integration. Data comes with different levels of quantity and quality, from hand-curated to NLP extractions. Querying in the single- or double-digit terabyte range with RDF is quite possible, as we have shown many times on this blog, but most use cases do not even go that far. Anyway, what we see in the field is primarily a data diversity game. The scenario is data integration; the technology we provide is database. The data transformation proper, data cleansing, units of measure, entity de-duplication, and such core data-integration functions are performed using diverse, user-specific means.

    Jerven Bolleman of the Swiss Institute of Bioinformatics is a user of ours with whom we have long standing discussions on the virtues of federated data and querying. I advised Stefan to go talk to him; he has fresh views about the volume challenges with unexpected usage patterns. Designing for performance is tough if the usage pattern is out of the blue, like correlating air humidity on the day of measurement with the presence of some genomic patterns. Building a warehouse just for that might not be the preferred choice, so the problem field is not exhausted. Generally, I’d go for warehousing though.

    What technology would you like to have? Network or power efficiency?

    OK. Even a fast network is a network. A set of processes on a single shared-memory box is also a kind of network. InfiniBand is maybe half the throughput and 3x the latency of single threaded interprocess communication within one box. The operative word is latency. Making large systems always involves a network or something very much like one in large scale-up scenarios.

    On the software side, next to nobody understands latency and contention; yet these are the core factors in any pursuit of scalability. Because of this situation, paradigms like MapReduce and bulk synchronous parallel (BSP) processing have become popular because these take the communication out of the program flow, so the programmer cannot muck this up, as otherwise would happen with the inevitability of destiny. Of course, our beloved SQL or declarative query in general does give scalability in many tasks without programmer participation. Datalog has also been used as a means of shipping computation around, as in the work of Hellerstein.

    There are no easy solutions. We have built scale-out conscious, vectorized extensions to SQL procedures where one can express complex parallel, distributed flows, but people do not use or understand these. These are very useful, even indispensable, but only on the inside, not as a programmer-facing construct. MapReduce and BSP are the limit of what a development culture will absorb. MapReduce and BSP do not hide the fact of distributed processing. What about things that do? Parallel, partitioned extensions to Fortran arrays? Functional languages? I think that all the obvious aids to parallel/distributed programming have been conceived of. No silver bullet; just hard work. And above all the discernment of what paradigm fits what problem. Since these are always changing, there is no finite set of rules, and no substitute for understanding and insight, and the latter are vanishingly scarce. "Paradigmatism," i.e., the belief that one particular programming model is a panacea outside of its original niche, is a common source of complexity and inefficiency. This is a common form of enthusiastic naïveté.

    If you look at power efficiency, the clusters that are the easiest to program consist of relatively few high power machines and a fast network. A typical node size is 16+ cores and 256G or more RAM. Amazon has these in entirely workable configurations, as documented earlier on this blog. The leading edge in power efficiency is in larger number of smaller units, which makes life again harder. This exacerbates latency and forces one to partition the data more often, whereas one can play with replication of key parts of data more freely if the node size is larger.

    One very specific item where research might help without having to rebuild the hardware stack would be better, lower-latency exposure of networks to software. Lightweight threads and user-space access, bypassing slow protocol stacks, etc. MPI has some of this, but maybe more could be done.

    So, I will take a cluster of such 16-core, 256GB machines on a faster network, over a cluster of 1024 x 4G mobile phones connected via USB. Very selfish and unecological, but one has to stay alive and life is tough enough as is.

    Are there pressures to adapt business models based on big data?

    The transition from capex to opex may be approaching maturity, as there have been workable cloud configurations for the past couple of years. The EC2 from way back, with at best a 4 core 16G VM and a horrible network for $2/hr, is long gone. It remains the case that 4 months of 24x7 rent in the cloud equals the purchase price of physical hardware. Assuming a typical three-year hardware life, renting around the clock thus costs roughly 36/4 = 9x the purchase price, so renting wins only below roughly 1/9 average utilization. So, for this to be economical long-term at scale, the average utilization should be about 10% of the peak, and peaks should not be on for more than 10% of the time.

    So, database software should be rented by the hour. A 100-150% markup on the $2.80/hour that a large EC2 instance costs would be reasonable. Consider that 70% of the cost in TPC benchmarks is database software.

    There will be different pricing models combining different up-front and per-usage costs, just as there are for clouds now. If the platform business goes that way and the market accepts this, then systems software will follow. Price/performance quotes should probably be expressed as speed/price/hour instead of speed/price.

    The above is rather uncontroversial but there is no harm restating these facts. Reinforce often.

    Well, the question is raised, what should Europe do that would have tangible impact in the next 5 years?

    This is a harder question. There is some European business in wide area and mobile infrastructures. Competing against Huawei will keep them busy. Intel and Mellanox will continue making faster networks regardless of European policies. Intel will continue building denser compute nodes, e.g., integrated Knight’s Corner with dual IB network and 16G fast RAM on chip. Clouds will continue making these available on demand once the technology is in mass production.

    What’s the next big innovation? Neuromorphic computing? Quantum computing? Maybe. For now, I’d just do more engineering along the core competence discussed above, with emphasis on good marketing and scalable execution. By this I mean trained people who know something about deployment. There is a huge training gap. In the would-be "Age of Data," knowledge of how things actually work and scale is near-absent. I have offered to do some courses on this to partners and public alike, but I need somebody to drive this show; I have other things to do.

    I have been to many, many project review meetings, mostly as a project partner but also as reviewer. For the past year, the EC has used an innovation questionnaire at the end of the meetings. It is quite vague, and I don’t think it delivers much actionable intelligence.

    What would deliver this would be a venture capital type activity, with well-developed networks and active participation in developing a business. The EC is not now set up to perform this role, though. But the EC is a fairly large and wealthy entity, so it could invest some money via this type of channel. Also there should be higher individual incentives and rewards for speed and excellence. Getting the next Horizon 2020 research grant may be good, but better exists. The grants are competitive enough and the calls are not bad; they follow the times.

    In the projects I have seen, productization does get some attention, e.g., the LOD2 stack, but it is not something that is really ongoing or with dedicated commercial backing. It may also be that there is no market to justify such dedicated backing. Much of the RDF work has been "me, too" — let’s do what the real database and data integration people do, but let’s just do this with triples. Innovation? Well, I took the best of the real DB world and adapted this to RDF, which did produce a competent piece of work with broad applicability, extending outside RDF. Is there better than this? Well, some of the data integration work (e.g., LIMES) is not bad, and it might be picked up by some of the players that do this sort of thing in the broader world, e.g., Informatica, the DI suites of big DB vendors, Tamr, etc. I would not know if this in fact adds value to the non-RDF equivalents; I do not know the field well enough, but there could be a possibility.

    The recent emphasis for benchmarking, spearheaded by Stefano Bertolo is good, as exemplified by the LDBC FP7. There should probably be one or two projects of this sort going at all times. These make challenges known and are an effective means of guiding research, with a large multiplier: Once a benchmark gets adopted, infinitely more work goes into solving the problem than in stating it in the first place.

    The aims and calls are good. The execution by projects is variable. For 1% of excellence, there apparently must be 99% of so-and-so, but this is just a fact of life and not specific to this context. The projects are rather diffuse. There is not a single outcome that gets all the effort. In this, the level of engagement of participants is less and focus is much more scattered than in startups. A really hungry, go-getter mood is mostly absent. I am a believer in core competence. Well, most people will agree that core competence is nice. But the projects I have seen do not drive for it hard enough.

    It is hard to say exactly what kinds of incentives could be offered to encourage truly exceptional work. The American startup scene does offer high rewards and something of this could be transplanted into the EC project world. I would not know exactly what form this could take, though.


              Virtuoso Elastic Cluster Benchmarks AMI on Amazon EC2   

    We have another new Amazon machine image, this time for deploying your own Virtuoso Elastic Cluster on the cloud. The previous post gave a summary of running TPC-H on this image. This post is about what the AMI consists of and how to set it up.

    Note: This AMI is running a pre-release build of Virtuoso 7.5, Commercial Edition. Features are subject to change, and this build is not licensed for any use other than the AMI-based benchmarking described herein.

    There are two preconfigured cluster setups; one is for two (2) machines/instances and one is for four (4). Generation and loading of TPC-H data, as well as the benchmark run itself, is preconfigured, so you can do it by entering just a few commands. The whole sequence of doing a terabyte (1000G) scale TPC-H takes under two hours, with 30 minutes to generate the data, 35 minutes to load, and 35 minutes to do three benchmark runs. The 100G scale is several times faster still.

    To experiment with this AMI, you will need a set of license files, one per machine/instance, which our Sales Team can provide.

    Detailed instructions are on the AMI, in /home/ec2-user/cluster_instructions.txt, but the basic steps to get up and running are as follows:

    1. Instantiate machine image ami-811becea (AMI ID is subject to change; you should be able to find the latest by searching for "OpenLink Virtuoso Benchmarks" in "Community AMIs"; this one is short-named virtuoso-bench-cl) with two or four (2 or 4) R3.8xlarge instances within one virtual private cluster and placement group. Make sure the VPC security is set to allow all connections.

    2. Log in to the first, and fill in the configuration file with the internal IP addresses of all machines instantiated in step 1.

    3. Distribute the license files to the instances, and start the OpenLink License Manager on each machine.

    4. Run 3 shell commands to set up the file systems and the Virtuoso configuration files.

    5. If you do not plan to run one of these benchmarks, you can simply start and work with the Virtuoso cluster now. It is ready for use with an empty database.

    6. Before running one of these benchmarks, generate the appropriate dataset with the dbgen.sh command.

    7. Bulk load the data with load.sh.

    8. Run the benchmark with run.sh.

    Right now the cluster benchmarks are limited to TPC-H but cluster versions of the LDBC Social Network and Semantic Publishing benchmarks will follow soon.


              In Hoc Signo Vinces (part 21 of n): Running TPC-H on Virtuoso Elastic Cluster on Amazon EC2   

    We have made an Amazon EC2 deployment of Virtuoso 7 Commercial Edition, configured to use the Elastic Cluster Module with TPC-H preconfigured, similar to the recently published OpenLink Virtuoso Benchmark AMI running the Open Source Edition. The details of the new Elastic Cluster AMI and steps to use it will be published in a forthcoming post. Here we will simply look at results of running TPC-H 100G scale on two machines, and 1000G scale on four machines. This shows how Virtuoso provides great performance on a cloud platform. The extremely fast bulk load — 33 minutes for a terabyte! — means that you can get straight to work even with on-demand infrastructure.

    In the following, the Amazon instance type is R3.8xlarge, each with dual Xeon E5-2670 v2, 244G RAM, and 2 x 300G SSD. The image is made from the Amazon Linux with built-in network optimization. We first tried a RedHat image without network optimization and had considerable trouble with the interconnect. Using network-optimized Amazon Linux images inside a virtual private cloud has resolved all these problems.

    The network optimized 10GE interconnect at Amazon offers throughput close to the QDR InfiniBand running TCP-IP; thus the Amazon platform is suitable for running cluster databases. The execution that we have seen is not seriously network bound.

    100G on 2 machines, with a total of 32 cores, 64 threads, 488 GB RAM, 4 x 300 GB SSD

    Load time: 3m 52s
    Run Power Throughput Composite
    1 523,554.3 590,692.6 556,111.2
    2 565,353.3 642,503.0 602,694.9

    1000G on 4 machines, with a total of 64 cores, 128 threads, 976 GB RAM, 8 x 300 GB SSD

    Load time: 32m 47s
    Run Power Throughput Composite
    1 592,013.9 754,107.6 668,163.3
    2 896,564.1 828,265.4 861,738.4
    3 883,736.9 829,609.0 856,245.3

    For the larger scale we did 3 sets of power + throughput tests to measure consistency of performance. By the TPC-H rules, the worst (first) score should be reported. Even after bulk load, this is markedly less than the next power score due to working set effects. This is seen to a lesser degree with the first throughput score also.
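
    For readers unfamiliar with the metric: the composite is the standard TPC-H geometric mean of the power and throughput scores, so for the first 1000G run:

        QphH = sqrt( Power * Throughput )
             = sqrt( 592,013.9 * 754,107.6 )
             ≈ 668,163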

    The numerical quantities summaries are available in a report.zip file, or individually --

    Subsequent posts will explain how to deploy Virtuoso Elastic Clusters on AWS.

    In Hoc Signo Vinces (TPC-H) Series


              Introducing the OpenLink Virtuoso Benchmarks AMI on Amazon EC2   

    The OpenLink Virtuoso Benchmarks AMI is an Amazon EC2 machine image with the latest Virtuoso open source technology preconfigured to run —

    • TPC-H , the classic of SQL data warehousing

    • LDBC SNB, the new Social Network Benchmark from the Linked Data Benchmark Council

    • LDBC SPB, the RDF/SPARQL Semantic Publishing Benchmark from LDBC

    This package is ideal for technology evaluators and developers interested in getting the most performance out of Virtuoso. This is also an all-in-one solution to any questions about reproducing claimed benchmark results. All necessary tools for building and running are included; thus any developer can use this model installation as a starting point. The benchmark drivers are preconfigured with appropriate settings, and benchmark qualification tests can be run with a single command.

    The Benchmarks AMI includes a precompiled, preconfigured checkout of the v7fasttrack github repository, checkouts of the github repositories of the benchmarks, and a number of running directories with all configuration files preset and optimized. The image is intended to be instantiated on a R3.8xlarge Amazon instance with 244G RAM, dual Xeon E5-2670 v2, and 600G SSD.

    Benchmark datasets and preloaded database files can be downloaded from S3 when large, and generated as needed on the instance when small. As an alternative, the instance is also set up to do all phases of data generation and database bulk load.

    The following benchmark setups are included:

    • TPC-H 100G
    • TPC-H 300G
    • LDBC SNB Validation
    • LDBC SNB Interactive 100G
    • LDBC SNB Interactive 300G (SF3)
    • LDBC SPB Validation
    • LDBC SPB Basic 256 Mtriples (SF5)
    • LDBC SPB Basic 1 Gtriple

    The AMI will be expanded as new benchmarks are introduced, for example, the LDBC Social Network Business Intelligence or Graph Analytics.

    To get started:

    1. Instantiate machine image ami-eb789280 (AMI ID is subject to change; you should be able to find the latest by searching for "OpenLink Virtuoso Benchmarks" in "Community AMIs"; this one is short-named virtuoso-bench-6) with a R3.8xlarge instance.

    2. Connect via ssh.

    3. See the README (also found in the ec2-user's home directory) for full instructions on getting up and running.


              SNB Interactive, Part 3: Choke Points and Initial Run on Virtuoso   

    In this post we will look at running the LDBC SNB on Virtuoso.

    First, let's recap what the benchmark is about:

    1. fairly frequent short updates, with no update contention worth mentioning
    2. short random lookups
    3. medium complex queries centered around a person's social environment

    The updates exist so as to invalidate strategies that rely too heavily on precomputation. The short lookups exist for the sake of realism; after all, an online social application does lookups for the most part. The medium complex queries are to challenge the DBMS.

    The DBMS challenges have to do firstly with query optimization, and secondly with execution with a lot of non-local random access patterns. Query optimization is not a requirement, per se, since imperative implementations are allowed, but we will see that these are no more free of the laws of nature than the declarative ones.

    The workload is arbitrarily parallel, so intra-query parallelization is not particularly useful, if also not harmful. There are latency constraints on operations which strongly encourage implementations to stay within a predictable time envelope regardless of specific query parameters. The parameters are a combination of person and date range, and sometimes tags or countries. The hardest queries have the potential to access all content created by people within 2 steps of a central person, so possibly thousands of people, times 2000 posts per person, times up to 4 tags per post. We are talking in the millions of key lookups, aiming for sub-second single-threaded execution.

    The test system is the same as used in the TPC-H series: dual Xeon E5-2630, 2x6 cores x 2 threads, 2.3GHz, 192 GB RAM. The software is the feature/analytics branch of v7fasttrack, available from www.github.com.

    The dataset is the SNB 300G set, with:

    1,136,127 persons
    125,249,604 knows edges
    847,886,644 posts, including replies
    1,145,893,841 tags of posts or replies
    1,140,226,235 likes of posts or replies

    As an initial step, we run the benchmark as fast as it will go. We use 32 threads on the driver side for 24 hardware threads.

    Below are the numerical quantities for a 400K operation run after 150K operations worth of warmup.

    Duration: 10:41.251
    Throughput: 623.71 (op/s)

    The statistics that matter are detailed below, with operations ranked in order of descending client-side wait-time. All times are in milliseconds.

    % of total total_wait name count mean min max
    20     % 4,231,130 LdbcQuery5 656 6,449.89    245 10,311
    11     % 2,272,954 LdbcQuery8 18,354 123.84    14 2,240
    10     % 2,200,718 LdbcQuery3 388 5,671.95    468 17,368
    7.3   % 1,561,382 LdbcQuery14 1,124 1,389.13    4 5,724
    6.7   % 1,441,575 LdbcQuery12 1,252 1,151.42    15 3,273
    6.5   % 1,396,932 LdbcQuery10 1,252 1,115.76    13 4,743
    5     % 1,064,457 LdbcShortQuery3PersonFriends 46,285 22.9979  0 2,287
    4.9   % 1,047,536 LdbcShortQuery2PersonPosts 46,285 22.6323  0 2,156
    4.1   % 885,102 LdbcQuery6 1,721 514.295   8 5,227
    3.3   % 707,901 LdbcQuery1 2,117 334.389   28 3,467
    2.4   % 521,738 LdbcQuery4 1,530 341.005   49 2,774
    2.1   % 440,197 LdbcShortQuery4MessageContent 46,302 9.50708 0 2,015
    1.9   % 407,450 LdbcUpdate5AddForumMembership 14,338 28.4175  0 2,008
    1.9   % 405,243 LdbcShortQuery7MessageReplies 46,302 8.75217 0 2,112
    1.9   % 404,002 LdbcShortQuery6MessageForum 46,302 8.72537 0 1,968
    1.8   % 387,044 LdbcUpdate3AddCommentLike 12,659 30.5746  0 2,060
    1.7   % 361,290 LdbcShortQuery1PersonProfile 46,285 7.80577 0 2,015
    1.6   % 334,409 LdbcShortQuery5MessageCreator 46,302 7.22234 0 2,055
    1     % 220,740 LdbcQuery2 1,488 148.347   2 2,504
    0.96  % 205,910 LdbcQuery7 1,721 119.646   11 2,295
    0.93  % 198,971 LdbcUpdate2AddPostLike 5,974 33.3062  0 1,987
    0.88  % 189,871 LdbcQuery11 2,294 82.7685  4 2,219
    0.85  % 182,964 LdbcQuery13 2,898 63.1346  1 2,201
    0.74  % 158,188 LdbcQuery9 78 2,028.05    1,108 4,183
    0.67  % 143,457 LdbcUpdate7AddComment 3,986 35.9902  1 1,912
    0.26  % 54,947 LdbcUpdate8AddFriendship 571 96.2294  1 988
    0.2   % 43,451 LdbcUpdate6AddPost 1,386 31.3499  1 2,060
    0.0086% 1,848 LdbcUpdate4AddForum 103 17.9417  1 65
    0.0002% 44 LdbcUpdate1AddPerson 2 22       10 34

    At this point we have in-depth knowledge of the choke points the benchmark stresses, and we can give a first assessment of whether the design meets its objectives for setting an agenda for the coming years of graph database development.

    The implementation is well optimized in general but still has maybe 30% room for improvement. We note that this is based on a compressed column store. One could think that alternative data representations, like in-memory graphs of structs and pointers between them, are better for the task. This is not necessarily so; at the least, a compressed column store is much more space efficient. Space efficiency is the root of cost efficiency, since as soon as the working set is not in memory, a random access workload is badly hit.

    The set of choke points (technical challenges) actually revealed by the benchmark is so far as follows:

    • Cardinality estimation under heavy data skew — Many queries take a tag or a country as a parameter. The cardinalities associated with tags vary from 29M posts for the most common to 1 for the least common. Q6 has a common tag (in top few hundred) half the time and a random, most often very infrequent, one the rest of the time. A declarative implementation must recognize the cardinality implications from the literal and plan accordingly. An imperative one would have to count. Missing this makes Q6 take about 40% of total time, instead of the 4.1% seen when the plan adapts.

    • Covering indices — Being able to make multi-column indices that duplicate some columns from the table often saves an entire table lookup. For example, an index on post by author can also contain the post's creation date.

    • Multi-hop graph traversal — Most queries access a two-hop environment starting at a person. Two queries look for shortest paths of unbounded length. For the two-hop case, it makes almost no difference whether this is done as a union or a special graph traversal operator. For shortest paths, this simply must be built into the engine; doing this client-side incurs prohibitive overheads. A bidirectional shortest path operation is a requirement for the benchmark.

    • Top K — Most queries returning posts order results by descending date. Once there are at least k results, anything older than the kth can be dropped, adding a date selection as early as possible in the query. This interacts with vectored execution, so that starting with a short vector size more rapidly produces an initial top k.

    • Late projection — Many queries access several columns and touch millions of rows but only return a few. The columns that are not used in sorting or selection can be retrieved only for the rows that are actually returned. This is especially useful with a column store, as this removes many large columns (e.g., text of a post) from the working set; see the sketch after this list.

    • Materialization — Q14 accesses an expensive-to-compute edge weight, the number of post-reply pairs between two people. Keeping this precomputed drops Q14 from the top place. Other materialization would be possible, for example Q2 (top 20 posts by friends), but since Q2 is just 1% of the load, there is no need. One could of course argue that this should be 20x more frequent, in which case there could be a point to this.

    • Concurrency control — Read-write contention is rare, as updates are randomly spread over the database. However, some pages get read very frequently, e.g., some middle level index pages in the post table. Keeping a count of reading threads requires a mutex, and there is significant contention on this. Since the hot set can be one page, adding more mutexes does not always help. However, hash partitioning the index into many independent trees (as in the case of a cluster) helps for this. There is also contention on a mutex for assigning threads to client requests, as there are large numbers of short operations.

    In subsequent posts, we will look at specific queries, what they in fact do, and what their theoretical performance limits would be. In this way we will have a precise understanding of which way SNB can steer the graph DB community.

    SNB Interactive Series


              SNB Interactive, Part 2: Modeling Choices   

    SNB Interactive is the wild frontier, with very few rules. This is necessary, among other reasons, because there is no standard property graph data model, and because the contestants support a broad mix of programming models, ranging from in-process APIs to declarative query.

    In the case of Virtuoso, we have played with SQL and SPARQL implementations. For a fixed schema and well known workload, SQL will always win. The reason is that SQL allows materialization of multi-part indices and data orderings that make sense for the application. In other words, there is transparency into physical design. An RDF/SPARQL-based application may also have physical design by means of structure-aware storage, but this is more complex and here we are just concerned with speed and having things work precisely as we intend.

    Schema Design

    SNB has a regular schema described by a UML diagram. This has a number of relationships, of which some have attributes. There are no heterogeneous sets, i.e., no need for run-time typed attributes or graph edges with the same label but heterogeneous end-points. Translation into SQL or SPARQL is straightforward. Edges with attributes (e.g., the foaf:knows relation between people) would end up represented as a subject with the end points and the effective date as properties. The relational implementation has a two-part primary key and the effective date as a dependent column. A native property graph database would use an edge with an extra property for this, as such are typically supported.

    The only table-level choice has to do with whether posts and comments are kept in the same or different data structures. The Virtuoso schema uses a single table for both, with nullable columns for the properties that occur only in one. This makes the queries more concise. There are cases where only non-reply posts of a given author are accessed. This is supported by having two author foreign key columns each with its own index. There is a single nullable foreign key from the reply to the post/comment being replied to.
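
    In rough DDL, the single-table choice looks like the following; this is an illustrative sketch, not the exact Virtuoso schema:

        CREATE TABLE post (
            ps_postid       BIGINT PRIMARY KEY,
            ps_creatorid    BIGINT,   -- author of a non-reply post
            ps_c_creatorid  BIGINT,   -- author of a reply; NULL otherwise
            ps_reply_of     BIGINT,   -- post/comment replied to; NULL for posts
            ps_creationdate BIGINT,   -- milliseconds since epoch, see below
            ps_content      VARCHAR
        );
        CREATE INDEX post_author  ON post (ps_creatorid);
        CREATE INDEX reply_author ON post (ps_c_creatorid);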

    The workload has some frequent access paths that need to be supported by index. Some queries reward placing extra columns in indices. For example, a common pattern is accessing the most recent posts of an author or a group of authors. There, having a composite key of ps_creatorid, ps_creationdate, ps_postid pays off since the top-k on creationdate can be pushed down into the index without needing a reference to the table.
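
    As a sketch (the DDL is illustrative), the composite index described above would be:

        -- The trailing columns make the index covering: the top-k cut
        -- on ps_creationdate is evaluated inside the index, and
        -- ps_postid comes along for free, so the table itself is
        -- never touched.
        CREATE INDEX ps_creator_date ON post (ps_creatorid, ps_creationdate, ps_postid);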

    The implementation is free to choose data types for attributes, particularly datetimes. The Virtuoso implementation adopts the practice of the Sparksee and Neo4j implementations and represents this as a count of milliseconds since the epoch. This is less confusing, faster to compare, and more compact than a native datetime datatype that may or may not have timezones, etc. Using a built-in datetime seems to be nearly always a bad idea. A dimension table or a number for a time dimension avoids the ambiguities of a calendar or at least makes these explicit.

    The benchmark allows procedurally maintained materializations of intermediate results for use by queries as long as these are maintained transaction-by-transaction. For example, each person could have the 20 newest posts by their immediate contacts precomputed. This would reduce Q2 "top of the wall" to a single lookup. This does not however appear to be worthwhile. The Virtuoso implementation does do one such materialization for Q14: A connection weight is calculated for every pair of persons that know each other. This is related to the count of replies by either to content generated by the other. If there does not exist a single reply in either direction, the weight is taken to be 0. This weight is precomputed after bulk load and subsequently maintained each time a reply is added. The table for this is the only row-wise structure in the schema and represents a half-matrix of connected people, i.e., person1, person2 -> weight. Person1 is by convention the one with the smaller p_personid. Note that comparing IDs in this way is useful but not normally supported by SPARQL/RDF systems. SPARQL would end up comparing strings of URIs with disastrous performance implications unless an implementation-specific trick were used.
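
    A sketch of the materialized weight table; names and DDL are illustrative (the actual implementation declares this one table row-wise, as noted above):

        -- Half-matrix of connected people: by convention,
        -- cw_person1 < cw_person2, so each pair is stored once.
        CREATE TABLE conn_weight (
            cw_person1 BIGINT,
            cw_person2 BIGINT,
            cw_weight  REAL,
            PRIMARY KEY (cw_person1, cw_person2)
        );

        -- Q14 then reads the weight with a single lookup; the caller
        -- passes the smaller id first.
        SELECT cw_weight
          FROM conn_weight
         WHERE cw_person1 = ? AND cw_person2 = ?;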

    In the next installment, we will analyze an actual run.

    SNB Interactive Series


              Computer Degree Programs for Rewarding IT Careers   
    Technology has invaded almost every aspect of our lives - from businesses, which depend on it to run their processes, to education, which uses it generously for finding new ways of imparting knowledge as well as increasing its reach.

    Those who want to enter into the exciting, ever-changing world and work closely with technology should consider a career in IT. There are different types of computer-related degrees and programs which can train you in a specific area of Information Technology. Here's a quick look at some of the popular computer degree programs that you can consider to start your career in IT.

    Bachelor of Computer Science: This is one of the most sought-after computer degrees and an obvious choice for many individuals passionate about computers. But before you jump into it, you need to consider if you have the aptitude for this program. A BS in Computer Science requires above average mathematical and analytical skills. As part of your program, you will be learning complex concepts ranging from algorithms and discrete mathematics to programming languages and networking principles. You can choose from a variety of exciting career options like software engineering, network architecture, database management, etc.

    A Bachelor of Computer Science degree typically takes about four years to complete, but some colleges also offer it on a fast-track schedule. You can earn this degree on-campus or enroll in an online degree program at an accredited institution.

    Computer Engineering: This is an engineering degree that combines elements of both computer science and electrical engineering. This computer degree program is suited for individuals who want to work on the hardware side of computers. Computer hardware engineers are responsible for designing, developing, testing, installing, and maintaining computer hardware. As a result, a computer engineering degree imparts training in electronic engineering, software design and hardware-software integration. The types of courses it covers include computer architecture and organization, digital electronics, circuit analysis, embedded systems, etc. A candidate's science and math skills need to be strong for this degree as well.

    Computer Programming: This computer degree is the most pertinent choice for individuals who are singularly focused on programming. A lot of aspiring programmers gravitate towards this program because it combines the study of programming languages with courses on databases, networks, and Internet applications. This type of degree is available at the Associate's, Bachelor's as well as at the Master's level. An Associate's degree in Computer Programming is the preferred choice for students looking for a quick entry into the workforce as it can be earned in two years or less.

    An Associate's of Science in Computer Programming degree qualifies graduates for entry-level programming jobs. Programmers or developers, as they are sometimes referred to, are required to convert a software design into a logical series of instructions called code that will make a computer perform a specific task. In addition to writing new programs, programmers also update, repair, and modify existing programs.

    Computer Technology and Networking: This is another computer degree program that has captured the interest of many students aspiring to become a computer technician. As a computer technician or a support specialist, you help people to use their computers. You install software and tools on machines, fix problems when they arise, and are responsible for system upkeep. A Computer Technology and Networking degree also trains graduates for the job of a network administrator who designs, installs and supports computer systems in an organization. However, it's possible that you may have to work as a support specialist before you're offered the role of a network or systems administrator.

    This is typically an Associate's program and can be completed in two years' time or even less on a flexible schedule from some colleges. Graduates of this program are encouraged to complete additional professional certifications to expand their knowledge and boost their job prospects.

    Computer Information Systems: This computer program is also available at the Associate's, Bachelor's and Master's level. Depending on the type of degree you earn, you will be responsible for designing, building, and implementing technology in an organization in order to drive its business forward.

    A graduate degree in the field is more focused on the management of information systems in a company. A Master's degree will qualify you for the role of an information systems manager, who plans and directs all IT-related activities in his or her firm. An Associate's degree can help graduates obtain entry-level positions, while more responsible and superior roles are reserved for candidates who have a Bachelor's Degree in CIS.
              Will Computer Grammar Programs Improve Our Writing Performance?   
    Computer grammar programs - can they truly improve our English writing skills? English writing is one of the most important forms of communication today, so it is necessary to keep it correct and professional. Learn more about new ideas that will easily enable you to improve your daily writing assignments.

    Computer grammar programs provide advanced grammar and proofreading capabilities that aren't available with our conventional word processors. Analyzing text for correct grammar is a great challenge for software developers; it requires a massive database as well as smart analyzing algorithms. In most cases these solutions enable us to do the following: text editing, grammar checking, spelling correction, and proper punctuation.

    By using this automatic proofreading technology we gain the following:

    - Helping with critical writing assignments such as job and patent applications.
    - Avoiding common writing mistakes we tend to repeat in our daily writing assignments.
    - Helping us to avoid embarrassing grammar mistakes.

    If we examine it closer we would probably find additional benefits that were not added into this quick list, as this technology keeps improving, bringing us fresh improvements and ideas that help us with our grammar writing and proofreading skills.

    Computer grammar programs help us catch common writing errors as we proofread our writing assignments. No Natural Language Processing technology is perfect, but it can definitely help us cover most of our common writing errors. Although it brings many challenges to software developers, we can expect this technology to keep developing, simply because writing is among the most significant tools that help us with many of our day-to-day assignments, whether at home, at school, or in the office.
              Computer Repair Programs   
    Whether you are a student facing an assignment deadline, a worker who needs to finish a project, or a housewife who wants to buy online, the computer is the one fixture in your daily routine that makes you 'unstoppable' - you can manage your work, all in one place. However, it can be a real turn-off if you miss your deadline or due date just because your dependable device gave up on you at the oddest of times...

    Now you have 2 options to deal with your computer repair problem, if you are living in Memphis, US.

    1. Short-term solution: You can easily search on Google and call a Memphis computer repair shop and ask for a technician who can give you service at your doorstep - be it your home or office. This can save you the trouble of disconnecting, disassembling, and carrying the whole setup over to the store. Some local geeks can also come over to give you service at low rates. Most Memphis computer repair websites give you online service by guiding you to solve problems yourself, if it's a minor fault.

    2. Long term solution: This option becomes cost-effective and eventually lucrative in the long run. Be a technician yourself! Here also, you have 2 options.

    a. To get a full-time degree education in the field of IT to serve as an IT professional
    b. To continue with your current line of studies and/or job and later join university courses/programs for an executive program to add skills to your resume (while solving your home computer problems in a jiffy).

    Some of the IT programs with proper certifications, offered in the area of Memphis computer repair are given as follows:

    — CompTIA A+, Network+, Security+
    Prepares students for entry-level jobs in building, repairing, configuring, and troubleshooting computers and software; networks and their upgrades; and operational security measures like cryptography, firewall setup, etc.

    — Microsoft Certified Systems Administrator (MCSA)
    Teaches launching, maintaining, and troubleshooting Microsoft Windows programs.

    — Microsoft Certified Systems Engineer (MCSE)
    This program prepares students for information systems that use Microsoft Windows Server with Active Directory and BackOffice server products.

    — Database administrator courses
    These can be of several types. They provide skills for installing Microsoft SQL Server and other single- or multi-dimensional databases; managing accounts, availability, recovery, and reporting; troubleshooting SQL Server problems, etc.

    — Cisco Network Certification:
    Students are given knowledge of the Cisco Internetwork Operating System, with its concepts and commands, as well as setup and operation, LAN/WAN, etc.

    — MS Office Specialists:
    This is a very handy course as it's applicable to the daily routine in any career path. It gives students a total command of MS Office programs such as Word, Excel, PowerPoint, Access, and Outlook.

    — Oracle Database Certifications:
    As evident from the name, it gives an introduction to database servers like Oracle, with SQL and PL/SQL. It also covers design, operation, maintenance, and troubleshooting of Oracle databases.

    Whether you live in Memphis or in Olive Branch, Bartlett, Millington, Cordova, Germantown, Hernando, Horn Lake, Southaven, Tunica, etc., you will always have Memphis computer repair shops or Memphis computer repair institutes to help you solve your problems.
              Design.Principles   

    Originally posted on: http://geekswithblogs.net/jasonfranks/archive/2008/09/02/design.principles.aspx


    Slight change of plan; I'll get back to the topic I offered last time--"Smarter/stupider UI"--after this one. I think this needs to come first.

    This is my attempt at a quick, no-bullshit set of principles that determine what makes a good design. None of this is headline news, but I learned this stuff the hard way and I am continually surprised by how many people do not consider these issues at all.

    1/ Control Dependencies
    Every module or subsystem should be as independent as possible. Build upwards, not across--each unit is built on top of more basic units, but dependencies on its peers should be carefully and systematically managed. Certainly, they should not ever depend on functionality from layers of code that lie above their own.

    For example, a server is dependent on a database. Logical entities that exist inside the server inherit this dependency--but the relationships between those entities must adhere to strict rules. The stricter those rules, the easier it is to create the entities and the easier it is to maintain or replace them. The server should also allow clients to connect and communicate with it, but giving the server detailed knowledge of those clients will lead to lunacy, despair and death.

    2/ Inheritance vs Genericity
    Know when to inherit and when to be generic; know what's an interface, what's an ancestor, and what's an aggregate.

    Any base class that exists to unite two otherwise distinct hierarchies is probably a mistake--there are better options. Perhaps a class that owns both objects can control how they interact directly? Perhaps the features you want to enforce should be encapsulated into an interface? Perhaps you can build a generic, type-independent object that can add the functionality to a class belonging to any hierarchy? (Yes, that means 'templates'.) Perhaps an adaptor class is required to wrap conforming behaviour around a specific class (the ever popular 'adapter pattern').

    3/ Design Patterns Do Not Maketh A Design
    Having read DESIGN PATTERNS does not make you a good architect. Mere use of design patterns does not make a product well designed. If the object model is incorrect, no design pattern is appropriate--develop the hierarchy first and then determine which patterns are useful to the system.

    If you find yourself in Mad Scientist territory trying to crossbreed two distinct design patterns you are probably barking up the wrong tree. Clean the whiteboard, get a haircut and think again.

    4/ Occam's Razor
    Write as tight as you can: no hobbyhorse coding, no playpen libraries. More code = more dev time + more bugs + more QA. That adds up to a greater possibility of failure. Plan for the future, but don't build what isn't necessary. The simpler solution is better.

    The cliche about 'premature optimization' belongs here--code needs to be correct before you worry about how fast it is. Counting CPU cycles is pointless if the system doesn't work. A cleaner, slimmer design will perform better than a fatter, more complex one; it really is that simple.

    5/ Magic Happens

    There's no getting around it, sometimes 'magic' is necessary. The programmer waves his wand and the soup congeals into catfood.  Sometimes you discover a hole in the design, or there's a corner case you didn't expect, or the pieces just don't fit the way you thought they would, and you just have to hack (excuse me, 'magic') them together to make a deadline.

    As the Amazing Jonothan says, "Sometimes magic sounds like tape." Duct tape, to be precise.

    There's no avoiding it, but if your design calls for magic from the outset it's not really a design--it's not a cauldron of soup, it's a crock of the same old shit.

    -- JF


              Welcome to Dev.Hell   

    Originally posted on: http://geekswithblogs.net/jasonfranks/archive/2007/06/25/welcome-to-dev.hell.aspx


    Welcome to Dev.Hell. I'm the Advocate, but you can call me Jason. Other people call me a complaining nerd. It's all good.

    For the last 8 years I've been a professional Windows applications developer.

    I live in Australia, although I just spent five years working in the USA. I've worked on your standard client/server n-tier database apps, I've worked on CAD/engineering applications, I've worked on network security apps, and right now I'm working in the physical security space. I have a couple of side projects that I can't or won't talk about here. I've worked for tiny companies of fifteen souls or less, I've worked for public companies with thousands. I've worked solo and I've worked on teams of five, fifteen and fifty. I've been the junior guy, I've been the behind-the-scenes-emergency-fixer-upper-guy, I've been dev lead and architect. I've mentored younger guys, filtered resumes, and tested developers twice my age in interview situations (which has been both embarrassing and educational). I've seen quite a lot of stuff in a relatively short career.

    I'm mostly a Visual C++ guy these days, although I do a bit of C#/.NET. 

    I am also a writer. I write prose, I write comics, I write articles about comics. If you're interested in that, you're more than welcome to check out my main website. I've decided to shift my blogging about technology matters over here because I'm looking for a more appropriate audience than the comics crowd that reads my other blog, and I will indeed be retooling some of my old postings from there for this blog.

    So that's my professional resume. Here's my rationale:

    I'm a committed developer. It's what I do; I don't want to be part of the business side of things...  but I live and work in the real world, and business wants to be part of what I do. While I do enjoy getting paid every week, the results usually aren't pretty. I love writing software, but these external forces just make everything so damned difficult. Hence this blog: I want to explore the tension between engineering and business, government, finance, education, and that Klingon-speaking, caffeine-swilling, acne-encrusted beast we lovingly call Development Culture.

    I'm not going to get especially technical--there's plenty of venues for that already all across the length and breadth of the internet. I'm sure if anything particularly interesting or aggravating crops up it'll find its way here, but that's not really what I want to do on this blog. I want to raise awareness of how the Industry succeeds and fails, where it all seems to be going, and what has to happen before we can dig our way out of Dev.Hell and ascend to Tech.Paradise.

    See you in the funny pages.

    -- JF


    "We are all in the gutter, but some of us are looking at the stars." Oscar Wilde
    "Opinions are like assholes--everybody's got one." Dirty Harry


              Now, BU students can pay most fees online   
    Students of Bangalore University (BU) and its affiliated colleges will no longer have to queue up on the Jnanabharathi campus to submit challans, as the university on Saturday launched software for online payment of fees.

    As of now, all fees, except examination fee, convocation fee and affiliation fee, can be paid using the system. The university's finance officer has promised that within 10 days, online payment of the remaining fees will also be facilitated.

    The software has been developed by InHawk IT solutions Pvt Ltd with State Bank of India (SBI) and Axis Bank payment gateways.

    To pay the fees online, students should visit buofc.inhawk.com and log in. The system is integrated with the student database and will automatically display the student's fee details and the amount to be paid. Payment can be made with debit/credit cards and internet banking. Offline payments can be made at the branches of SBI and Axis Bank using the challans generated online. The present method of fee collection through demand draft (DD) will continue for the next three months to allow for a smooth transition. Affiliated colleges will be provided training on the use of software to make the payments.

    A fee of Rs 13 will be collected by the service provider for each transaction. According to university officials, the online payment will be beneficial to students and colleges as the DD charges are higher. "For a transaction of Rs 10 lakh, colleges incur demand draft charges of up to Rs 2,000. With the software, they will be charged just Rs 13 even if they are transferring the fees collected from 1,500 students," a university official said.

              Purchase Ledger   
    A manufacturing company based in Erith, Dartford is seeking a Purchase Ledger to join their busy team. The salary is £20-£21k per annum. 37.5-hour week, Monday to Friday, 8.30 to 5pm. 20 days holiday, which accrues yearly by one day following a whole year of service, to a max of 25.

    I am looking for a positive, upbeat person who can process high volumes of invoices. You must be computer literate; if you have knowledge of SAP it would be a bonus. Full training and support will be given. We need someone who fully understands the basic nature of purchase ledger and all that it involves.

    Job duties: Reporting to the Purchase Ledger Supervisor within a team of 2 purchase ledgers and 20 within the wider finance team. A very busy department servicing a live customer database in excess of 16,000.
    - Processing invoices
    - Payment runs
    - Reconciliation

    This position will start off as a temporary role with a view to becoming permanent after completing a satisfactory 4-week trial period. If you are available immediately to interview, then please apply today for consideration. Reed Specialist Recruitment Limited is an employment agency and employment business.
              Database Tour Pro 8.2.4.33   

              FAQ   
    Akismet checks your comments and contact form submissions against our global database of spam to protect you and your site from malicious content.
              Debunking a Few Well Known Car Myths   

    The infographic looks quite attractive and informative. The information presented is concise and short. So, it means that each point is explained in a detailed manner...

    The post Debunking a Few Well Known Car Myths appeared first on Infographic Database.


              Software Engineer - Aveva - Engineer, BC   
    PML (AVEVA application macro language). Development Manager, Database Team, Cambridge....
    From Aveva - Sun, 11 Jun 2017 09:43:41 GMT - View all Engineer, BC jobs
          Database Design Basic Course   
              Senior Oracle Database Administrator - Wesco Aircraft - Austin, TX   
    Deliver expert level assistance to the Wesco staff across the globe, including but not limited to Oracle 10g, Oracle 11g or Oracle 12c....
    From Wesco Aircraft - Mon, 12 Jun 2017 20:10:00 GMT - View all Austin, TX jobs
              Software Project Coordinator - Miles Technologies - Moorestown, NJ   
    General knowledge of database systems and concepts, such as MS SQL Server, MS Access, or Oracle. Assess a business’ needs and translate them into solutions...
    From Miles Technologies - Fri, 23 Jun 2017 03:10:43 GMT - View all Moorestown, NJ jobs
              Bus Intelligence Developer I - Federal Home Loan Bank of NY - Jersey City, NJ   
    Knowledge of RDBMS – Oracle and MS SQL Server preferred. Assist the database administrator where appropriate (in cases where there are deficient DBA resources)...
    From Federal Home Loan Bank of NY - Wed, 10 May 2017 13:24:20 GMT - View all Jersey City, NJ jobs
              Developer - Tradeweb Markets LLC - Jersey City, NJ   
    Oracle DB preferred (Sybase or other relational database is ok). Responsibilities/Accountabilities:....
    From Tradeweb - Thu, 30 Mar 2017 00:44:57 GMT - View all Jersey City, NJ jobs
              Internal Controls Senior Analyst NA - Mondelez International - Toronto, ON   
    Lead SOX reviews including updating control documentation in GRC database and providing oversight to the IC COE team for remote testing....
    From Mondelēz International - Sat, 24 Jun 2017 10:31:57 GMT - View all Toronto, ON jobs
              Systems Analyst (Teradata Database Developer) - McCain Foods (Canada) - Florenceville, NB   
    McCain Foods is seeking a Systems Analyst, specialized in Teradata database development, to contribute to the success of our Enterprise Data Warehouse (EDW)...
    From McCain - Tue, 04 Apr 2017 12:46:27 GMT - View all Florenceville, NB jobs
              Marketing Manager - DAP Products Inc. - Ontario   
    This includes development and management of a database warehouse of consumer focused and retail sales fact based data....
    From DAP Products Inc. - Wed, 07 Jun 2017 00:02:06 GMT - View all Ontario jobs
              TRAFFIC MANAGEMENT SYSTEM OPERATOR (ONCALL) - Ministry of Transportation - Ottawa East, ON   
    You are proficient with computers and with software such as word processing, spreadsheets, database applications and browser based user interface.... $24.77 - $27.46 an hour
    From Ontario Public Service - Mon, 26 Jun 2017 09:33:48 GMT - View all Ottawa East, ON jobs
                 
    Greenspun: "I'm thinking of writing a tutorial on how to use the Windows XP file system as a photo database."
              Automotive Parts Counter Consultant   
    PA-Pittsburgh, Bentley Pittsburgh/Maserati of Pittsburgh is in need of a full-time parts and accessories consultant. We offer medical benefits after 90-days, and a 401K plan after 1-year as well. Job Purpose: Sells automotive parts by taking and clarifying customer orders; retrieving and selling new and replacement automotive parts; receiving and recording new parts inventory; maintaining parts databases; mainta
              Litigation Support Specialist - Miller Thomson - Canada   
    We are seeking a Litigation Support Specialist to join our Litigation team. Troubleshoot litigation support applications and databases....
    From Miller Thomson - Fri, 31 Mar 2017 02:47:06 GMT - View all Canada jobs
              Product Database Specialist - Teknion Limited - Toronto, ON   
    Baan, Operations, PPG etc. Teknion creates furniture that connects people, technology and spaces....
    From Teknion Limited - Fri, 23 Jun 2017 23:43:48 GMT - View all Toronto, ON jobs
              Kidney Care Advocate- Full Time - Export PA - Fresenius Medical Care - Export, PA   
    Responsible to ensure accurate and timely documentation of patient interactions and status, through maintenance of SAP database, and/or electronic medical...
    From Fresenius Medical Care - Sat, 24 Jun 2017 20:29:34 GMT - View all Export, PA jobs
              Senior Oracle Database Administrator - Wesco Aircraft - Austin, TX   
    Previous work experience with Oracle JDEdwards ERP system, SAP Oracle database using BRTOOLS, Oracle 11g on Solaris10 systems administration, Linux systems...
    From Wesco Aircraft - Mon, 12 Jun 2017 20:10:00 GMT - View all Austin, TX jobs
              (USA-MS-Jackson) Specialist, Flt CAP/PCRS   
    **Date:** Jun 30, 2017
    **Location:** Jackson, MS, US
    **Company:** Entergy
    **Primary Location:** Mississippi-Jackson
    **Job Function:** Other
    **MRV Minimum Salary:** $88,000.00
    **MRV Maximum Salary:** $132,000.00
    **FLSA Status:** Professional
    **Relocation Option:** Approved in accordance with the Entergy guidelines
    **Union description/code:** NON BARGAINING UNIT-NBU
    **Number of Openings:** 1.00
    **Req ID:** 71902
    **Travel Percentage:** 25% to 50%

    **JOB SUMMARY/PURPOSE:** This position is responsible for maintaining the corrective action program database, assisting and training site personnel with the use of the database, and assisting with governance and oversight of the corrective action program.

    **JOB DUTIES/RESPONSIBILITIES:** Establishing the standards and writing material related to the corrective action program database (PCRS). Assisting the sites with maintaining proper setup and utilization of PCRS modules. Ensuring corrective action program personnel receive appropriate training on the functionality of PCRS and how to perform administrator duties. Fill the role of business owner lead of PCRS when software changes are needed. Assist the CFAM CAP/OE with governance and oversight activities for the corrective action program. Representing the CFAM as directed, performing assessments of corrective action programs and performance analysis of the corrective action program.

    **MINIMUM REQUIREMENTS:**
    **Minimum education required of the position:** B.S. Degree in Engineering or related technical degree may be credited for up to 4 years of experience, or equivalent work experience.
    **Minimum experience required of the position:** 6 years commercial or military nuclear experience.
    **Minimum knowledge, skills and abilities required of the position:** Technical understanding of nuclear generation principles and operation. Familiar with industry recognized root cause analysis methods and techniques. Desired: Working knowledge of computer applications, including Microsoft Excel, Word, Access, PowerPoint and PCRS.
    **Any certificates, licenses, etc. required for the position:** None

    **WORKING CONDITIONS:** As a provider of essential services, Entergy expects its employees to be available to work additional hours, to work in alternate locations, and/or to perform additional duties in connection with storms, outages, emergencies, or other situations as deemed necessary by the company. Exempt employees may not be paid overtime associated with such duties.

    **EEO Statement:** The Entergy System of Companies provides equal employment opportunities (EEO) to all employees and applicants for employment without regard to race, color, religion, gender, sexual orientation, gender identity or expression, national origin, age, disability, genetic information, marital status, amnesty, or status as a protected veteran in accordance with applicable federal, state and local laws. The Entergy System of Companies complies with applicable state and local laws governing non-discrimination in employment in every location in which the company has facilities. This policy applies to all terms and conditions of employment including, but not limited to, recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation, and training. The Entergy System of Companies expressly prohibits any form of unlawful employee harassment based on race, color, religion, gender, sexual orientation, gender identity or expression, national origin, age, genetic information, disability, or veteran status. Improper interference with the ability of the Entergy System of Company employees to perform their expected job duties is absolutely not tolerated. Entergy provides reasonable accommodations for online applicants. Requests for a reasonable accommodation may be made orally or in writing by an applicant, employee, or third party on his or her behalf.

    **Additional Responsibilities:** As a provider of essential services, Entergy expects its employees to be available to work additional hours, to work in alternate locations, and/or to perform additional duties in connection with storms, outages, emergencies, or other situations as deemed necessary by the company. Exempt employees may not be paid overtime associated with such duties.

    **Pre-employment Testing:** One way that Entergy has found to identify and assess the abilities and skills needed for certain jobs is through pre-employment testing. If this position does require an EEI test, the type of test will be located under the qualifications section of the job posting. If you are invited to a test session, we strongly recommend you review and complete the practice test as well as review the testing brochure for your respective test. The test brochure will give you critical information on the test such as time allocated and number of questions. Also, keep in mind that the actual test is timed; you should practice timing yourself while doing the practice tests. The practice test information and test brochures can be located by going to the EEI website, http://www.eei.org/practicetests , Logon ID: entergy, password: practice test (2 words). In addition to EEI testing there is also Fit-for-Duty testing which will identify and assess the abilities and skills needed for certain jobs. If this position does require Fit-for-Duty testing, the type of test will be located under the qualifications section of the job posting.

    **Nearest Major Market:** Jackson Mississippi
    **Job Segment:** Engineer, Nuclear Engineering, Nuclear, Testing, Database, Engineering, Energy, Technology
              (USA-MS-Jackson) Staff Services Engineer   
    **About Us:** GE is the world’s Digital Industrial Company, transforming industry with software-defined machines and solutions that are connected, responsive and predictive. Through our people, leadership development, services, technology and scale, GE delivers better outcomes for global customers by speaking the language of industry. It is not about your career… it is not about your job title… it is about who you are…. It is about the impact you are going to make on the world. You want to go into uncharted waters… do things that haven’t been done to make yours and someone else's life better. GE has been doing that for decades! We will continue to do so! We are the world’s digital industrial company.

    GE Oil & Gas fuels the future. We push the boundaries of technology to bring energy to the world. We are inventing the next industrial era in the oil and gas sector. In our labs and factories, and in the field, we constantly push the boundaries of technology to solve today’s toughest operational & commercial challenges. We have the skills, knowledge and technical expertise to bring together the physical and digital worlds to fuel the future. GE Oil & Gas is a fullstream company, working from exploration and production to downstream. A $19 billion leader designed for a world of complex resources. The deeper, the hotter, the more remote, the more logistically difficult or environmentally sensitive the challenge – the more GE Oil & Gas can help. Through project management expertise and technology innovation, we work to help lower costs, make things faster, simpler and more productive for our customers. In today’s era of complex resources, the deeper, the hotter, the more remote, the more logistically difficult or environmentally sensitive the challenge – the more we can help. The GE scale helps us bring new solutions to market quicker to help our customers adapt to the industry’s changing environment. We go where you go and operate side-by-side in 120 countries. The closer we are to our customers, the quicker we can anticipate and solve their challenges. We’re at work today, to ensure the next generation is equipped and empowered to go further and deeper, helping to fuel the future. Follow GE Oil & Gas on Twitter @GE_OilandGas. GE is diversity. We aim to employ the worlds’ brightest minds to help us create an unlimited source of ideas and opportunities. We believe in hiring talented people of varied backgrounds, experiences and styles - people like you!

    **Role Summary:** In this role, you will work closely with customers to understand and define requirements, develop technical proposals, and set expectations for software implementations/upgrades and interface projects. You will act as the technical lead, as well as an individual contributor, on these engagements, contributing to software implementation, troubleshooting, customization, and integration into customer systems while balancing scope versus project time and resource commitments.

    **Essential Responsibilities:** You will act as an SME for the organization, coordinating cross-organizationally and independently mentoring peers.
    + Provide expert knowledge and experience to collaborate with the customer to identify technical requirements and estimate the technical effort required to implement complex software solutions
    + Provide technical support to applications
    + Perform all installation and/or programming tasks related to agreed interface & conversion specifications and/or application assignments, including agreed upon system tailoring and customizations
    + Collaborate with Project Managers and Services Consultants throughout the project to identify and scope applications changes while adhering to the change management process
    + Execute on and serve as technical lead for the implementation of software solutions.
    + Own technical deliverables during the entire lifecycle of the projects.
    + Engage throughout the full lifecycle of assigned projects, influencing decisions on design, and functionality to keep projects on track in terms of budget, time, and customer expectations.
    + Effectively leverage product capability, driving standardization, limiting customization, and maximizing reuse of content developed for previous solutions.
    + Interact with Product Development Team, Commercial Team, Customers, Solution Providers (Partner / Integrators), and other cross-functional teams as required for the solution and implement processes to ensure best use of GE Digital products and services.
    + Effectively communicate both verbally and in writing with peers and team members as an inclusive team member, supporting pre-sale strategy (as needed) and project execution.
    + Act as a technical leader or mentor on complex, integrated customer implementations, either within individual project teams and/or cross-organizationally.
    + Effectively apply GE Digital execution methodology and project standards.
    + Maintain & continuously update technical skills and knowledge.
    + Work independently as well as part of the team.
    + Maintain strong customer relational and communication skills

    **Qualifications/Requirements:**
    Basic Qualifications:
    + Bachelor’s degree and 5+ years of experience in software services, or equivalent (defined as: High School Diploma/GED and 7+ years progressive experience in software services).
    + 5+ years’ experience with Windows, .NET, SQL or Oracle databases, Intersystems CachéTM Object Script, C++, Java, HL7, Web Services, SOA technologies, or similar programming languages/technologies
    Eligibility Requirements:
    + Legal authorization to work in the U.S. is required. We will not sponsor individuals for employment visas, now or in the future, for this job
    + Must be willing to work out of an office location in Houston, TX or Foxborough, MA

    **Desired Characteristics:**
    + Software skills in software analysis, design, methodology, and architecture
    + Experience in software analysis, design, quality assurance, architecture, and Agile development methodology
    + Demonstrated ability to learn new software development practices, languages, and tools
    + Experience with enterprise software and large distributed systems
    + Strong spoken and written English language skills
    + Experience with development in an ISO-certified environment
    + Practical experience with web and other technologies such as HTML 5, JavaScript, CSS and JQuery
    + 5+ years’ experience in software analysis, design, methodology, development and architecture
    + Programming experience in software development with Microsoft Visual Studio / Development tools, Eclipse, or C# programming as well as, strong familiarization with Digital Historian.
    Technical Expertise:
    + Applies technical fundamentals in specific projects. Applies architecture development process & methods at various stages of the project lifecycle. Applies architecture adoption best practices on multiple programs.
    + Possesses proven ability to deploy in both on premise and cloud based environments.
    + Demonstrates ability to diagnose and solve intermediate to advanced issues. Can isolate highly complex scenarios down to actionable items and drive solutions.
    + Coaches others on troubleshooting and problem solving techniques.
    + Ability to communicate product capabilities internally and in customer facing situations.
    Business Acumen:
    + Anticipates potential risks and obstacles and resolves proactively in order to ensure smooth project delivery. Maintains high levels of customer satisfaction across all projects. Addresses customer concerns quickly and effectively; at times, anticipates customer concerns before they become issues.
    + Leverages knowledge of market and customer segment in order to establish further credibility in the eyes of the customer.
    + Plans and facilitates collaborative discussions with client and others within GE to identity and prioritize client's overall business needs.
    + Understands when and when not to depart from the standard provision of deliverables. Implements scalable systems according to governance and standards guidelines/boundaries, collaborating with others as necessary.
    Leadership:
    + Helps team members understand their contributions in support of the broader direction.
    + Serves as a mentor to newer team members in support of business goals and objectives and product development roadmap
    + Continuously measures the completion rate of personal and team deliverables and compares them to the scheduled commitments.
    + Effectively balances different, competing objectives.
    + Engages positively across multiple departments, GE businesses and customers as needed to manage conflict and establish clarity, vision, and mutual trust in order to achieve a business goal.
    + Adjusts information (e.g. level of complexity) and story to align with audience. Produces functional area information in sufficient detail for cross-functional teams to utilize, using presentation and storytelling concepts.
    Personal Attributes:
    + Engages with product related problems and questions in a disciplined and rigorous manner
    + Persists on completion of endeavors, especially in the face of overwhelming odds and setbacks. Pushes self for individual results and others through team spirit.
    + Increases client engagement to further drive the pace and focus required to achieve business priorities and uncover desired outcomes for both the customer and GE; utilizes business acumen and domain experience to advise the customer on critical success factors for the initiative at hand; continuously influences the customer to think ahead on what is needed to acquire, deploy, and utilize the solution. \#DTR

    **Locations:** United States; Massachusetts, Texas; Houston, Foxborough

    GE offers a great work environment, professional development, challenging careers, and competitive compensation. GE is an Equal Opportunity Employer at http://www1.eeoc.gov/employers/upload/eeoc_self_print_poster.pdf . Employment decisions are made without regard to race, color, religion, national or ethnic origin, sex, sexual orientation, gender identity or expression, age, disability, protected veteran status or other characteristics protected by law. GE will only employ those who are legally authorized to work in the United States for this opening. Any offer of employment is conditional upon the successful completion of a background investigation and drug screen.
              (USA-MS-Jackson) Center Administrator / Hinds County   
    This Position is for a Center Administrator with an early childhood development program serving Hinds County. Under the direction and supervision of the Early Head Start Director, this position has the responsibility of supervising and managing the overall day-to-day operations of the assigned center to ensure delivery of quality services to enrolled children and families.
    Desired Qualifications: Master's Degree in Early Childhood Education (ECE), Human Resources, or equivalent combination with three (3) years of demonstrated experience in management, administration and supervision of a childcare facility, or related social service program.
    Minimum Qualifications: Bachelor's Degree in ECE, Human Resources, OMB, or equivalent combination with five (5) years of demonstrated experience in management and supervision of a childcare facility, office management, case management, or related social service program; ability to accurately comprehend oral and written assignments and interpretation of policies and procedures, Head Start Performance Standards, and federal, state and local regulations; strong computer skills (Microsoft products to include spreadsheets, databases, presentation and word processing).
              (USA-MS-Jackson) Center Clerk / Hinds County   
    This Position is for a Center Clerk with an early childhood development center located in Hinds County.
    Summary: Under the supervision of the Center Administrator, this position has the responsibility of overseeing the Center's main office reception area and performing other administrative and clerical functions.
    Desired Qualifications: Associate's degree in Business Administration / Office Technology or related discipline, with one or more years' experience in general office practices with emphasis on reception and switchboard; strong computer skills using PCs and a variety of software applications, word processing, databases and email; ability to work in a fast-paced, multi-task environment; excellent oral and written communication skills required.
              Wildlife Biologist II – Baffin - GOVERNMENT OF NUNAVUT - Pond Inlet, NU   
    Applied knowledge of statistical procedures, applications, data tabulation, computer applications coupled with the ability to establish databases and geographic... $97,734 a year
    From Indeed - Fri, 17 Mar 2017 19:01:25 GMT - View all Pond Inlet, NU jobs
              Two-year increase in homicide   
    I examined the top 55 cities in America. Collectively, 51 million people live in those cities, roughly one-sixth of America's population. Between 2014 and 2016, those cities saw a 23 percent increase in homicides (6,977 to 8,612).

    In terms of raw numbers, the cities with the largest increases in people being killed are Chicago, Kansas City, Houston, Baltimore, and Memphis. [Were one to take these top five cities out of the equation -- and there's no moral or good statistical reason for doing so, but just for fun -- the rest of the cities would still see, on average, a two-year 15 percent increase. Even that would be worrisome. So, no. It's not just Chicago.]

    I dropped cities with fewer than 40 murders in 2016 because, statistically, a small n leads to overly dramatic year-to-year changes.

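    For readers who want to check the arithmetic, here is a minimal sketch of the calculations described above, in Python. The citywide totals come from this post; the per-city figures and the population used in the rate example are hypothetical placeholders.

        # Two-year percent change in homicides, per the method described above
        def pct_change(old, new):
            # Percent change from old to new, rounded to the nearest percent
            return round(100 * (new - old) / old)

        print(pct_change(6977, 8612))  # -> 23, the increase cited above

        # Homicide rate per 100,000 residents (both figures hypothetical)
        print(100_000 * 344 / 8_500_000)  # roughly 4.0 per 100,000

        # Drop small-n cities (fewer than 40 murders in 2016) before comparing
        murders_2016 = {"Example City A": 38, "Example City B": 412}  # hypothetical
        kept = {city: n for city, n in murders_2016.items() if n >= 40}
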
    What's left is 43 cities ranging from little Richmond (220,000 people) to big NYC (8.5 million), from safe San Diego (homicide rate 3.5 per 100,000) to dangerous St. Louis and Baltimore (more than 50 homicides per 100,000).

    Thirty-nine of 43 cities saw an increase. Three (Pittsburgh, Boston, Columbus) didn't. New York City is unchanged.



    2-year change in homicide rate (per 100,000), 2014-2016

    Albuquerque: +103% | Atlanta: +19% | Austin: +25% | Bakersfield: +153% | Baltimore: +49% | Boston: -15% | Charlotte: +60% | Chicago: +80% | Cleveland: +33% | Columbus: -11% | Dallas: +49% | Denver: +97% | Detroit: +6% | Durham: +95% | El Paso: -19% | Fort Worth: +35% | Fresno: -17% | Hampton Roads, VA: +39% | Houston: +44% | Indianapolis: +10% | Jacksonville: +25% | Kansas City: +140% | Las Vegas: +44% | Long Beach: +28% | Los Angeles: +13% | Louisville: +73% | Memphis: +63% | Miami: +6% | Milwaukee: +68% | Minneapolis: +19% | Nashville: +83% | New Orleans: +17% | New York: +1% | Oakland: +10% | Oklahoma City: +96% | Omaha: -6% | Philadelphia: +11% | Phoenix: +28% | Pittsburgh: -20% | Portland: -23% | Raleigh: +29% | Richmond: +45% | Sacramento: +46% | San Antonio: +36% | San Diego: +53% | San Francisco: +24% | San Jose: +50% | Seattle: -35% | St. Louis: +18% | Tucson: -11% | Tulsa: +76% | Washington: +29% | Wichita: +31%



    In addition to the UCR, here are some of my sources. Corrections welcome.
    https://www.abqjournal.com/923137/city-sees-highest-number-of-murders-in-20-years.html
    http://www.kerngoldenempire.com/homicide-tracker http://www.city-data.com/crime/crime-Bakersfield-California.html
    http://www.cleveland.com/metro/index.ssf/2017/01/year_in_review_homicides_surge.html
    http://www.newsobserver.com/news/local/community/durham-news/article123339719.html
    http://www.elpasotimes.com/story/news/crime/2017/03/23/rash-homicides-anomaly-police-say/99555590/
    http://www.star-telegram.com/news/local/community/fort-worth/article128019874.html
    http://abc30.com/news/domestic-violence-related-murders-rise-sharply-in-fresno-during-2016/1679846/
    http://fox59.com/2017/03/02/early-2017-homicide-total-climbs-as-mayor-hogsett-stays-the-course/
    http://jacksonville.com/homicides/2016
    http://www.kshb.com/homicide-tracker-2016
    http://homicide.latimes.com/neighborhood/long-beach/year/2016
    http://www.miamiherald.com/news/local/crime/article104679101.html
    https://projects.jsonline.com/apps/Milwaukee-Homicide-Database/
    http://www.startribune.com/statistics-show-minneapolis-violent-crime-edged-up-in-2016/409711555/
    http://www.nola.com/crime/index.ssf/2017/01/new_orleans_finishes_2016_with.html
    https://oaklandmofo.com/blog/oakland-homicide-count-is-rising
    http://dataomaha.com/homicides/2015
    https://www.phillypolice.com/crime-maps-stats/
    http://www.azcentral.com/story/news/local/phoenix/2017/03/29/maricopa-county-phoenix-area-homicide-map-2017/99735018/
    https://newsinteractive.post-gazette.com/homicide/
    http://koin.com/2017/01/01/4-of-portlands-20-homicides-from-2016-remain-unsolved/
    http://www.kcra.com/article/meet-the-tiniest-deer-being-nursed-back-to-health/10247250
    http://www.sandiegouniontribune.com/news/public-safety/sd-me-county-homicides-20170226-story.html
    https://www.tucsonaz.gov/files/police/pt1_16_summarytable_0.pdf
    http://www.kjrh.com/news/local-news/tulsa-homicides-in-2016-interactive-map-shows-location-of-homicides-during-record-year
    http://wtkr.com/2015/12/22/norfolk-is-the-deadliest-city-in-hampton-roads/
              Web Application Developer / Backend Programmer - Marketing Results - Henderson, NV   
    Marketing Results is a gaming industry pioneer in high-tech consumer database marketing. We are looking for a full-time IT professional who seeks to apply
    From Indeed - Thu, 15 Jun 2017 21:31:12 GMT - View all Henderson, NV jobs
              DataBase Icons Pack 1.0   
    A set of icons for building database interfaces and websites.
              Putin extends embargo on Western food imports through 2018   

    Russian President Vladimir Putin on Friday extended Moscow's embargo on food products from the West until the end of 2018, continuing its policy of retaliation for sanctions over Ukraine. A decree signed by Putin and posted in the official government database states that the embargo on produce, dairy, meat and most other foods will now stretch to December 31, 2018. The move comes days after the European Union formally rolled over damaging economic sanctions against Russia, and a week before Putin is set to hold his first meeting with US President Donald Trump at the G20 summit in Hamburg.



              Reflections on PWN2Own 2016: the state and future of computer security   

    Another year, another Pwn2Own contest.

    TL;DR results for 2016:

    • Prize money: about half a million USD.
    • All major browsers successfully exploited: Chrome, Safari, Edge
    • All attacks bypassed all exploitation countermeasures (e.g., sandboxing, address randomization) to successfully escalate all the way to root/SYSTEM level privileges
    • Nobody broke through the VM

    Reflections:

    1. Kernel security sucks and will always suck.

      Security mechanisms enforced by the kernel have more holes than Swiss cheese. From the point of view of an advanced attacker, code running in a "sandbox" as an unprivileged user is going to escalate to root/SYSTEM level privileges 100% of the time.

      There are too many lines of code, the attack surface is too large, and human programmers are too imperfect.

      The kernel is an unreliable primitive from a security standpoint. You just can't trust it.

    2. End-point security is still terrible and the same will be true 10 years from now (hello Pwn2Own 2026!) unless there is a radical change in software architecture that acknowledges the cold hard realities of computer security over the wishy-washy desires of senior executives.

      • Contests like Pwn2Own are just showing us the tip of our collective vulnerability iceberg: there are hundreds if not thousands of zero day exploitable holes lurking under the surface of all sufficiently complex software, especially software implemented in high-performing yet error-prone low-level languages. This includes all browsers and operating systems. Quinn Norton has it exactly right: Everything is broken.

        To get a hint of what lurks beneath the surface, read up on VUPEN Security, now rebranded as Zerodium, a 0day market which pays top dollar (up to 1 million USD) to hoard exploits and lists the NSA amongst its clients. By comparison, the bug bounties offered by most vendors are chump change.

      • It gets worse. Even if a genie granted us one wish and patched all existing vulnerabilities that wouldn't help for long because software is a fast moving target. Thanks to new development, vulnerabilities are likely opening up at a faster rate than they're being detected and patched.

      • Any conventional up-to-date computer with a browser can be compromised if you're willing to make the effort to develop zero day exploits and risk sacrificing the exploit if your attack is detected.

        Speaking of detection, unless you're attacking Kaspersky or other high-value targets it usually won't be, and even then the exploits you sacrifice are probably just a tiny part of your arsenal as an advanced attacker. Case in point: the attackers that went after Kaspersky sacrificed multiple zero days in their attempt. They had to know there was a high risk of detection, but they took the risk anyway. Why? Kaspersky think it was hubris, but I'll bet it's because they could afford to lose a handful of zero days. There's more where those came from.

      • Nearly everyone in the world is always just one wrong click away from being totally pwned.

        Advanced attackers are unimpressed that your system is fully patched. For high-risk applications, being fully patched does as much good as running an antivirus. Which isn't saying much.

        What you're really achieving when you play the security patch treadmill game is that you're undemocratizing illicit access to your systems: keeping the script kiddies at bay while maybe forcing more advanced attackers to factor in the risk of sacrificing a zero day from their arsenal. That's it.

      • The probable ubiquity of hardware backdoors in Intel & AMD chipsets is in practice somewhat irrelevant, since software is by far the weakest link in the chain and will remain so for the foreseeable future.

    3. Vulnerabilities in low-level (C/C++) code are still extremely relevant and the cost of attack is pretty low for client-side and privilege escalation attacks. A few weeks of a single skilled researcher's time.

      By now all the big companies have strong security awareness and yet none of them are managing to prevent modestly motivated attackers from achieving full remote code execution with system privileges.

      This state of affairs will not change until the fundamental security architecture of our systems changes. I expect to see more hardware enforced containment baked into the operating systems of the future.

      Examples of this trend in the wild:

      • Qubes OS
      • Microsoft Windows 10 Enterprise using the hypervisor to secure the LSA. This is mostly security theatre at present, but if the trend continues it could be useful.
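
      As a concrete aside on that trend, here is a hedged sketch (Python, Windows-only, read-only) that inspects the registry values Microsoft documents for virtualization-based security and the hypervisor-protected LSA (Credential Guard). The key paths and value names are my assumptions from that documentation, not something taken from this post.

        import winreg

        def read_dword(path, name):
            # Return the DWORD value, or None if the key/value is absent
            try:
                with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, path) as key:
                    value, _ = winreg.QueryValueEx(key, name)
                    return value
            except OSError:
                return None

        vbs = read_dword(r"SYSTEM\CurrentControlSet\Control\DeviceGuard",
                         "EnableVirtualizationBasedSecurity")
        lsa = read_dword(r"SYSTEM\CurrentControlSet\Control\LSA", "LsaCfgFlags")
        print("Virtualization-based security enabled:", vbs == 1)
        print("Credential Guard flag (1 = with UEFI lock, 2 = without):", lsa)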
    4. The stats for publicly released exploits don't tell the whole story

      If you look at the stats for the exploits being publicly released you'll notice low-level vulnerabilities have gone way down. I used to take that as a sign that there were fewer of these issues to exploit, and that's probably true to a degree, but there's an important cultural and economic aspect to this as well.

      I suspect part of the reason we're not seeing more Pwn2Own level exploits being released in public is that a lucrative private market has risen to disincentivize free disclosure while, simultaneously, the cost of fully weaponizing vulnerabilities has risen due to exploitation countermeasures. The people willing and capable of paying the toll have better uses for their skills than giving them away. Like selling exploits privately for up to a million dollars.

    5. Containment is the only realistic defensive strategy and hypervisors are the only semi-reliable primitive from which you can architect reasonably secured systems.

      Sure, there are likely undiscovered zero day "escape from VM" vulnerabilities in all of them, but hypervisors are much smaller and simpler than operating system kernels, so they have much smaller attack surfaces.

      They're also not moving as fast as other targets, so stamping out all the exploitable bugs should eventually be an achievable goal.

      VMs are also easy to set up as honeypots since the host has complete, transparent access to all the guests' resources, but not vice versa. Attackers will think long and hard before risking the sacrifice of a zero day in a hypervisor.

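      To make the containment idea concrete, here is a toy sketch that boots a disposable guest under QEMU: disk writes are discarded on exit and the guest gets no network, so whatever detonates inside dies with the VM. It assumes qemu-system-x86_64 is installed, and untrusted.img is a placeholder image name.

        import subprocess

        subprocess.run([
            "qemu-system-x86_64",
            "-m", "2048",       # give the guest 2 GB of RAM
            "-snapshot",        # write to a temporary overlay, discarded on exit
            "-nic", "none",     # no network interface for the untrusted guest
            "untrusted.img",    # placeholder disk image
        ], check=True)
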
    6. Decentralization is a good thing because big organizations of all stripes cannot be trusted to resist attack, uphold their own policies, or keep our secrets.

      Their attack surface is too large and too complicated. Too many assumptions have to hold for their security not to crumble like a house of cards in the face of an advanced attack.

      Since all of the big companies are eating their own vulnerable dog food (and each other's) and they're such irresistibly juicy targets, we should assume they are all deeply compromised by a plethora of intelligence agencies, organized crime, and clever individuals.

      The degree to which it is reasonable to let someone else safeguard your secrets is not just how much you trust them not to abuse that power themselves, but also how much you trust them not to be abused.

      That should be front and center in the discussion regarding mass surveillance, government-mandated backdoors, and how much of our private information we feel safe handing over to companies like Google and Facebook. Well-intentioned checks and balances at a legal level won't protect against hackers that have pwned your sysadmin's laptop.

      I believe the problem with trusting big organizations is inherently unfixable.

      Like most of us, big organizations will always prioritize getting things done over a serious attempt at closing off all avenues of attack, which is the way it should be. It's also what public opinion and public markets demand. Companies that over-prioritize security will go out of business. Governments that over-prioritize security may end up looking like North Korea.

      But in a world where we don't collectively trust big organizations to maintain our security and keep our secrets, attacks, while still possible, would be much harder to pull off. They'd have to pick us off one by one.

    7. If a true security renaissance ever takes place, the driving force will not be personal computers or mobile computing but self-driving cars and their like.

      Autonomous self-driving cars will give hackers the power of life and death over anyone that uses them and eventually over anyone that shares the road with these hackable computers on wheels.

      Just think about that for a moment. Plausibly deniable death from afar. An unfortunate accident or the perfect crime?

      On the other hand, so many people are killed in road accidents due to human error that society may accept/repress that risk and work to raise the bar so assassination by hacking is something only the most rich and powerful actually have to worry about. Gulp. I hope.


              A Bassoon from Down Under   


    Subject, Place, and Singularity.
    Those are the qualities that make
    a premium collectable photograph.
     
    The unusual subject here
    is a gentleman holding a bassoon,
    an instrument rarely seen
    in cabinet card portrait photos.

    The curious singularity
    is his magnificent long mustache
    curled like the bocal on his instrument.

    But it is the unexpected place
    where the image was taken
    that makes this a unique photograph.

    Australia.






    The man sits in a relaxed pose,
    cross legged on a low chair,
    gazing to his right.
    He is dressed
    in formal white tie and tail coat,
    with a boutonniere on his lapel.
    His oiled hair is short
    and carefully groomed,
    and his imperial style mustache/beard
    gives him a debonair almost rakish air.
    His bassoon lies diagonally
    at rest across his thigh
    showing the double reed, bocal, and keywork
    but not the bell.

    It is the work of a skilled photographer,

    Instantaneous Portraits
    Falk
    496 George St. Sydney.

    Australia.


    There are countless cabinet card portraits from the 1880s and 1890s of gentlemen with impressive hair styles. But very few of those men also played bassoon. And even fewer lived in Sydney, Australia's largest city. This musician's photo wins the trifecta of exceptional qualities for a collectable photograph.

    In the 19th century Australia did have very fine photographers in the big cities like Sydney, Melbourne, Adelaide, and Perth, and vintage Australian photographs can be found on the American antique market, though not in any great number, usually dozens rather than hundreds. This musician's portrait was taken by Melbourne-born photographer Henry Walter Barnett (1862-1934) who trained in London. In 1887 he returned to Australia and opened Falk Studios on George Street, Sydney where he became renowned for his photo artistry and his expensive fees. Ten years later Barnett left for London again to establish an upscale portrait studio at Hyde Park Corner where his customers included the royal family and prominent members of English society. With Barnett's studio work so well documented, it seems safe to date this gentleman's cabinet card in the decade from 1887 to 1897. Yet clearly he paid handsomely for a quality photograph from a leading Sydney photographer.

    There is no marking on the photo's back. No studio imprint, no name or date. If the man played the violin or cornet it would not be an unusual photo, but it is his bassoon, the bass instrument of the woodwind family, and the fact that he is in Australia that makes this a remarkably rare vintage photo. Australia is a very big place, but in the 1890s its population was proportionately very small, and well-dressed bassoonists could only be a very, very small fraction of that number.

    So how many bassoonists got their name in an Australian newspaper?

    In the 1890s? 
    Not surprisingly, very few.
    But curiously one bassoonist
    was mentioned more often than expected.


    Sydney Morning Herald
    10 October 1891

    In October 1891 the Sydney Morning Herald ran an advertisement for a Grand Invitation Matinee Concert given by Signor Angelo Casiraghi, certified teacher of Violin and Harmony from the Conservatoire of Music, Leipzig. The afternoon concert included violin solos by Signor Casiraghi, several vocal numbers, a few works for orchestra, organ and harp, and two bassoon solos performed by Mr. Phil Langdale (late Soloist of the Cowan Orchestra). The titles, "Lucie Long" and "Carnival de Venise", were arrangements made by Mr. Langdale of popular tunes set with variations.

    The National Library of Australia is a wonderful historic archive with a free searchable newspaper database. Between 1888 and 1896 there were over 225 citations of "Phil Langdale, bassoon". Even for a noted violinist or pianist of this era this would be an exceptional amount of newspaper coverage.

    But Mr. Langdale played the bassoon.




    Melbourne Argus
    16 October 1888

    The reference to "late Soloist of the Cowan (sic) Orchestra" was to the orchestra employed for the 1888 Melbourne Centennial Exhibition. This event was organized to celebrate a century of European settlement in Australia. It was held at the Royal Exhibition Building which was built for the Melbourne International Exhibition of 1880–81. For this earlier world's fair the western nave of the main building had a specially built orchestral platform complete with a grand pipe organ, and enough choir tiers for 700 to 750 voices.

    Event organizers for the 1888 Centennial Exhibition anticipated that this concert feature would be a major attraction, so in 1888 they engaged the services of Frederic H. Cowen (1852-1935), a well-known British pianist, conductor, and composer. He had just been made conductor of the Philharmonic Society of London, succeeding the famous composer Arthur Sullivan. His fee to go to Melbourne for the Centennial Fair was £5,000, an amount considered at the time especially extravagant for any musician. His terms included the hiring of 15 principal musicians from Britain for the Exhibition Orchestra. One of those musicians was the bassoonist Phil Langdale.

    On the 15th October 1888 a smaller group of the orchestra presented an afternoon recital of solo pieces. On the program was an Air, with variations for bassoon, by F. Godfrey and played by Mr. Phil Langdale. Most of these fine solo performances were re-demanded and repeated, and the whole musical performance was found to be full of interest.

    * * *








    Melbourne Australasian
    4 August 1888


    The Centennial International Exhibition opened in Melbourne on 1 August 1888 and continued to 31 January 1889. Frederic Cowen's exhibition orchestra numbered 73 musicians, including Signor A. Casiraghi in the first violins and P. Langdale, principal bassoon.


    Orchestra musicians roster for
    the Centennial International Exhibition Melbourne: 1888-1889
    Source: Official Record 


    The 15 principals imported from Britain with Mr. Cowen were paid £10 per week. The exhibition commission also agreed to defray the cost of a second-class ticket for the steamship voyage to Australia and a return ticket, if desired. In 1888 the estimated travel time from London to Sydney was 50 days. The remainder of the orchestra was hired from musicians resident in the Australian Colonies. Their salaries varied from £3 10s to £12 per week. The 708 men and women in the Exhibition Choir sang gratis, without pay, though they got free passes into the Exhibition.

    Orchestra musicians' pay rate for
    the Centennial International Exhibition Melbourne: 1888-1889
    Source: Official Record 



    The Exhibition ran for a bit over 26 weeks during Australia's spring and summer seasons. Over the course of the festival the orchestra and choir performed for 211 Orchestral, 30 Grand Choral, and 22 Popular concerts under Mr. Cowen's direction. This is in addition to many vocal, piano and instrumental recitals, and countless concerts of military bands that provided music throughout the rest of the exhibition area and amusement park.




    Concert hall and grand organ for
    the Centennial International Exhibition Melbourne: 1888-1889
    Source: Official Record 

    Among the Grand Choral works were two performances of Beethoven's Choral 9th Symphony; four of Händel's "Messiah" oratorio; two of Haydn's "Creation" oratorio; four of Mendelssohn's "Elijah" oratorio; two of Rossini's "Stabat Mater"; and  twelve performances of Cowen's choral music, his "Ruth" oratorio, "Song of Thanksgiving", and "Sleeping Beauty" cantata.



    List of choral works performed at
    the Centennial International Exhibition Melbourne: 1888-1889
    Source: Official Record 


    The orchestral concerts included the remaining eight Beethoven symphonies; Berlioz's Symphonie Fantastique; Brahms' Symphony No. 3; Liszt's "Les Preludes"; Mendelssohn's Sym. No. 3 "Scotch" (sic), Sym. No. 4 "Italian", and Sym. No. 5 "Reformation" symphonies; two Schubert symphonies; and three Schumann symphonies. Nearly all were performed more than once. Beethoven's 6th Symphony, the "Pastorale", was played five times. The programs also included an astonishing number of overtures, 91 opera overtures including nearly all of those by Beethoven, Mendelssohn, Rossini, Schubert, Wagner, and Weber. There were also a few violin concertos and several piano concertos, along with numerous incidental pieces, opera selections, songs, ballets, marches, rhapsodies, ballads, and serenades.

    Quite a lot of this music was new and unfamiliar to both the musicians and Melbourne's audience. For example, Brahms' 3rd Symphony had had its premiere only in December 1883. Over 50 musical works programmed on concerts at the 1888 Centennial International Exhibition were first performances for Melbourne, and probably for Australia too.

    Concerts were scheduled twice a day, at 3:00 pm and 8:00 pm, six days a week, except Sunday. Presumably mornings were reserved for rehearsals. That's roughly 7 to 8 hours of music making each day, or 42 to 48 hours a week, not counting individual practice time. In comparison, modern orchestra musicians typically work a 20 to 24 hour week.

    List of orchestral works performed at
    the Centennial International Exhibition Melbourne: 1888-1889
    Source: Official Record 

    The Melbourne Exhibition Hall was modified to seat 2,500 people. Over the six months that the exhibition was open, an average of 1,915 tickets were sold for each concert, making a total attendance of 467,299. Of course, there were many other non-musical activities and sights for the public to see at the Melbourne exhibition park, but the musical arts were the chief attraction. It made for a daunting, if not exhausting, marathon list of music for any musician. For bassoonist Phil Langdale it meant easily a half dozen difficult bassoon solos to master each day. Only a well-trained musician could survive that level of intensity. Someone who knew how to wield a bassoon as a defensive weapon if the music so demanded.

    Someone who had been a member
    of Her Majesty's Coldstream Guards Band.


    Dublin Irish Times
    13 April 1875

    Philip Langdale was just 20 years old in 1875 when he performed a Bassoon Solo (with variations) in Dublin's Exhibition Palace as a member of the Band of the Coldstream Guards. He was born in 1855 in Sevenoaks, Kent and probably joined the band at around age 16. His instrument, the bassoon, had long held a place in military bands, providing a sonorous bass voice that was also capable of great musical agility. 

    The Coldstream Guards Band had a long musical tradition that dated back to 1785, and it held a reputation as one of the best in the British Army, which had a great number of military bands. This band provided music for ceremonial duties for Queen Victoria, as well as for other military events. But by the 1870s, military bands also were an important unit for the British government's public relations, traveling the country performing at innumerable flower shows, exhibitions, and civic affairs. Between 1873 and 1881 there were over a hundred newspaper references to Mr. Langdale's bassoon solos (with variations) at concerts by the Band of the Coldstream Guards. The band's programs were regularly published and Langdale's bassoon received much praise in the reviews. The music that the band played included an immense number of popular overtures, songs, and solo instrumental works arranged for wind band from orchestral scores, as well as the standard military marches. This disciplined musical training would have given young Philip Langdale a good grounding in all the current styles of European music.

    After 1881 his name appears less often as he seems to have left the Coldstream band for civilian life. In July 1883, Mr. P. Langdale appeared at London's Adelphi Theatre playing a bassoon solo "Lucy Long". In February 1885 another Langdale bassoon solo was advertised by Her Majesty's Theatre where an orchestra of 100, assisted by the Band of the London Rifle Brigade, played a concert of various opera overtures, solo vocal pieces, dances, and a Descriptive Fantasia: "A Voyage in a Troop Ship." In July 1885, Mr. Langdale demonstrated a bassoon made of ebonite, a man-made material, at a musical instrument exhibit of Rudall, Carte, and Co.

    But the only thing that this research proves
    is once upon a time a talented British bassoonist
    could boast of a surprising prestige
    on the Victorian era concert stage.

    It doesn't convincingly establish that
    the bassoonist with the wonderful curled mustache
    is Mr. Phil Langdale, the late bassoon soloist
    of the Melbourne Exhibition Orchestra.


    If only there was another photo.




    * * *




    The 1888 Melbourne Centennial Exhibition was an international exposition attracting elaborate displays from all around the world as well as Australia. Thousands of representatives of industry, trade, and the arts booked space at the exhibition to demonstrate their newest and best products. The planning also required hundreds of contractors and staff to operate the fair's activities. Concerned about maintaining security, the Melbourne Exhibition Commission decided to have individual photo portraits compiled of all persons employed at the exhibition. Many of these identity photos survive in the archives of the State Library of Victoria.

    The musicians of the Melbourne Centennial International Exhibition Orchestra worked 6 days a week through the entire event, so of course, they were photographed too. The State Library of Victoria has a souvenir collage of the orchestra with 68 musicians' ID photos surrounding a photo of their music director, Frederic H. Cowen. There are no instruments and no names, but the archive offered a high definition image to download.



    1888 Centennial International Exhibition Orchestra
    Paterson Bros., photographers
    Source: State Library Victoria Archives


    The musicians' photos, all men of course, illustrate the amazing variety of mustaches, beards, and hair styles that were the male fashion of the 1880s. This era might better be called the golden age of barbers. 




    The faces of many men were easily eliminated as too old, too fat, etc. But a few grainy images made promising matches. These two men, center row, 2nd and 3rd from right, bear a good resemblance to my bassoonist, and the one on the left has a similar impressively long mustache.








    This man, third row from bottom, 2nd from left, has a similar imperial style beard and a receding hair line.






    But the man pictured on the bottom row, 4th from right, made the best match to my bassoonist.
    His mustache may lack the twirled extensions but it has the same shape.
    I think his hooded eyes, high forehead, thin hair, and cheekbones
      make him a ringer for the man in my photograph.
    The two men also share an inclination for rumpled suit coats.








    The bassoonist Philip Langdale declined the Melbourne Exhibition Commission's offer of a steamship ticket to return to England, and instead stayed in Melbourne working as a professional musician. He played bassoon solos in Sydney, Brisbane, Adelaide, and even New Zealand that were commended in reviews for their wit and musical facility. Then as now, the sound of the bassoon is associated with musical humor, even though it is very capable of producing many other profound and beautiful emotions.

    But as time passed, the Australian audience's acclaim was not enough to meet a musician's financial challenges. By 1894 Langdale was evidently struggling to keep afloat in show business and hinting that he would soon leave for Britain.


    Melbourne Table Talk
    23 March 1894







    Mr. Phil Langdale's "benefit" concert on Thursday night last, at the town hall, drew a fairly large and, as his many good qualities deserve, a sympathetic audience. Mr. Langdale has, ever since his first appearance here with Mr. Cowen, been consistently a public favorite, and this quite as much on account of his amiable disposition and the ready sympathy he has always shown to his fellow artists, as on account of his mastery over his instrument.

    He had certainly no reason to complain of the warmth of the greeting offered to him when he first appeared upon the platform, nor of the applause that followed his first solo, the "Carnival de Venice." And of the floral tributes offered to him nothing could have been more appropriate than the one woven in the form of a bassoon. The warmth of feeling shown him should be a guarantee to Mr. Langdale that he bears with him the best wishes of his friends and admirers. 

    But the programme was inordinately long, and was not, on the whole as readily carried out as usual. Apart from the performance by Mr. Langdale, who was naturally the central figure of the evening...

    * * *


    Langdale managed another year in Australia
    before finally making his farewell concert in 1895.
    A Melbourne wag wrote an amusing tongue-in-cheek tribute
    that says a lot about Langdale and the friendships he made in Australia.


    Melbourne Punch
    11 July 1895









    Mr. Phil Langdale, the eminent bassoon player, who is leaving the colony for England almost immediately, is doing so in consequence of the small demand for bassoon playing in this country. He attributes this lack of interest in the instrument to the political management of the colony. It would be worth while for him to clearly explain what sort of political administration ought to prevail in order to make bassoon playing popular and profitable. 

    What is there that is anti-bassoonical in our present politics? Wagner, if we remember rightly, called the bassoon the "clown of the orchestra," on account of its appropriateness for producing comic effects. There are so many clowns in politics that we should have expected them to take a fraternal interest in the instrument, if they had any inclination to be interested in any instrument of music whatever.

    We are, however, really sorry Phil Langdale is going, and hope that the state of politics, of which he complains will bas-soon altered.

    * * *







    This photo detective has tried to connect an unmarked portrait of a musician with a name that has no likeness, but regrettably it is not conclusive proof of identification. However, circumstantial evidence sometimes is sufficient too. So I'm convinced that a musician like Phil Langdale, whose talent on the bassoon was so frequently recognized during his years in Australia and whose wit and charm had endeared him to many friends, would very likely invest in a handsome photograph like this as a gift for his admirers. It is the sort of thing one does when taking leave of a place and setting out on a long voyage to a distant land.






    * * *
    CODA

    The following year, December 1896, Phil Langdale was on stage in London as a bassoon soloist with the Inns of Court Orchestral Society. His name appears much less frequently than when he was with the Coldstream Guards Band, probably because he was working in theater orchestras and seaside pier bands. Around 1900 he began touring England with the "London Wind Quintette", an early instance of a professional wind chamber group. During the war years his novelty bassoon solos were occasionally worthy of note in newspaper reviews. The last mention of his name was in 1921, as principal bassoon of the Tonbridge Orchestral Society.

    I've left out his family history mainly because it was never mentioned in the Australian newspapers and is not pertinent to my case. However I have documented his name in the UK census and other records and know that Phil Langdale, born in 1855, married Selina Campbell, age 19, in 1885. Whether she accompanied him to Australia, I do not know. They had two daughters, Nina, born in 1887, and Phyllis, born in 1903.
    Philip Langdale, bassoonist, died on 22 October 1929 at age 74.



    Curiously his name appeared
    in the 1933 U.S. official catalogue of copyright entries
    for a bassoon solo with pf. acc. (pianoforte accompaniment)
    It was entitled
    We won't go home till morning;
    by Phil Langdale;
    ©Feb. 7, 1933 by Hawkes & son (London) ltd.


    1933 United States Catalogue of Copyright Entries





    As a special musical homage
    for his story

    let's listen to a rendition
    of one of Phil Langdale's favorite bassoon variations.

    This video comes from a March 25, 2012 concert
    at Edinborough Park, in Edina, Minnesota
    featuring Alex Legeros on bassoon

    with the Edina Sousa Band, playing "Lucy Long."

    ***


    ***









    This is my contribution to Sepia Saturday
    where the batter is up and the basses are loaded.


    http://sepiasaturday.blogspot.com/2017/05/sepia-saturday-369-27-may-2017.html

              Accounting Assistant (Part-time) - Southern Current LLC - Charleston, SC   
    Assist controller with various accounting and HR duties. 12-20 hours per week. Accounts payable duties to include posting invoices, maintaining vendor database,... $12 - $15 an hour
    From Indeed - Fri, 17 Mar 2017 19:21:27 GMT - View all Charleston, SC jobs
              Web Developer - NIKSUN, Inc. - Princeton, NJ   
    Minimum 2+ years with PHP development. UNIX, PHP, MySQL database, Apache server, Content management....
    From NIKSUN, Inc. - Wed, 07 Jun 2017 05:33:58 GMT - View all Princeton, NJ jobs
              Database Administrator - Spearhead Staffing - Bridgewater, NJ   
    Minimum Bachelor of Science in Computer Science or a Technology field. Reviews description of changes to database design to understand how changes affect the...
    From Spearhead Staffing - Wed, 28 Jun 2017 20:20:39 GMT - View all Bridgewater, NJ jobs
              Push Partner Registry   
    State

    The Push Partner Registry (PPR) is a five-county public health partnership with community-based organizations to dispense medications to at-risk populations during an emergency. The intent of the registry is to create a comprehensive database of regional private partners and community-based organizations that serve at-risk populations and are willing to serve as private point-of-dispensing (POD) sites during an emergency. Participating organizations receive several benefits from their involvement, including the opportunity to offer the medication to their clients as well as to staff members and their families. The initiative may also assist the entire community by reducing the number of citizens seeking medications at public dispensing sites.

    Materials included as part of this practice include a guide to distribute to potential organizational partners, a sample database, a Word document containing suggestions for printing and reproducing the kit materials, and answers to frequently asked questions. The "PPR Guide to Obtaining and Dispensing Medications to your Employees and Vulnerable Clients" focuses mainly on dispensing doxycycline during an anthrax emergency, but it could easily be adapted to other public health emergencies. The guide includes several materials to aid public health leaders in recruiting organizational partners and establishing a POD, including a cover letter, an enrollment form, a dispensing plan template, job action sheets, a sample POD intake form and flow chart, and a resource and reference list. The materials arrive at an especially opportune time, as many public health agencies are investigating alternate dispensing methods to reach vulnerable populations.
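
    The kit's sample database is not reproduced here, but a minimal sketch of what such a partner registry might look like, written with Python's built-in sqlite3 module, makes the idea concrete. Every table and column name below is a hypothetical illustration, not the schema actually shipped with the PPR materials.

    import sqlite3

    # Hypothetical schema for a partner-registry database; table and column
    # names are illustrative only, not the schema shipped with the PPR kit.
    conn = sqlite3.connect("ppr_sample.db")
    conn.execute("""
        CREATE TABLE IF NOT EXISTS partners (
            id                INTEGER PRIMARY KEY,
            org_name          TEXT NOT NULL,
            county            TEXT NOT NULL,
            contact_name      TEXT,
            contact_phone     TEXT,
            population_served INTEGER,  -- clients plus staff and their families
            willing_pod       INTEGER   -- 1 = willing to host a private POD site
        )
    """)
    conn.execute(
        "INSERT INTO partners (org_name, county, contact_name, contact_phone, "
        "population_served, willing_pod) VALUES (?, ?, ?, ?, ?, ?)",
        ("Example Community Center", "King", "A. Partner", "555-0100", 450, 1),
    )
    conn.commit()

    # List willing POD partners, largest served populations first.
    for row in conn.execute(
        "SELECT org_name, county, population_served FROM partners "
        "WHERE willing_pod = 1 ORDER BY population_served DESC"
    ):
        print(row)
    conn.close()

    A registry like this is only useful if it can be queried quickly during a response, which is why the willingness flag and population figures sit in the same table as the contact details.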


              Collecting medical countermeasure data   
    State

    Illinois Department of Health developed an online survey tool to collect medical countermeasure data, as required by CDC for the countermeasures report. Based on CDC's Medical Countermeasure Situational Report form, the survey tracks both the supplies that local health departments and hospitals have in stock and the supplies they have already distributed. In addition to meeting CDC's requirements, the resulting data provides timely information on the situation in the field, including which items are most used and where shortages may occur. This allows Illinois' Office of Preparedness and Response to make timely decisions on when to push out its SNS (Strategic National Stockpile) supply during the response effort.

    The survey is sent weekly to all of the state's local health departments and 156 hospitals that participate in the Hospital Preparedness Program (HPP). To engage the participating agencies in this new process, teleconferences were held and detailed emails were sent. While compliance has not been 100%, the response has been very good.

    Overall, this has been a very cost-effective project, requiring only limited staff time and a SurveyMonkey account. Illinois assigned an AmeriCorps intern to keep the survey updated and to send it to the participating agencies weekly. Upon reply, the IT department downloads the results into a Microsoft database and prints a report of potential shortages for the week. All perceived shortages are then verified with either the local health department's Emergency Response Coordinator or the hospital's Preparedness Program Coordinator. Once a shortage is confirmed, the agency then makes an official request to the state.
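
    As a rough illustration of that weekly shortage check, here is a minimal Python sketch. It assumes the survey results are exported as a CSV with hypothetical column names (agency, item, on_hand, weekly_use) and uses an arbitrary two-weeks-of-cover threshold; the actual Illinois database layout and report logic are not documented here.

    import csv

    # Flag potential shortages in a weekly countermeasure survey export.
    # The CSV column names (agency, item, on_hand, weekly_use) and the
    # two-weeks-of-cover threshold are illustrative assumptions, not the
    # actual Illinois report logic.
    WEEKS_OF_COVER = 2

    def potential_shortages(path):
        flagged = []
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                on_hand = float(row["on_hand"])
                weekly_use = float(row["weekly_use"])
                # Flag any agency whose stock covers less than two weeks of use.
                if weekly_use > 0 and on_hand < WEEKS_OF_COVER * weekly_use:
                    flagged.append((row["agency"], row["item"], on_hand, weekly_use))
        return flagged

    if __name__ == "__main__":
        for agency, item, on_hand, use in potential_shortages("survey_week.csv"):
            print(f"{agency}: {item} running low ({on_hand} on hand, ~{use}/week used)")

    Flagged rows would then go to a human for verification, mirroring the confirmation step with the Emergency Response Coordinator or Preparedness Program Coordinator described above.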

    Illinois has used the survey results to ascertain needs regarding: pre-positioning more antivirals and Personal Protective Equipment (PPE) at the local level, knowing on-hand amounts of antivirals, determining distribution, and tracking product expirations. Illinois' previous method for collecting medical countermeasure data was based on a 48-hour anthrax response. A pandemic situation and the need to pre-position pro rata allocations of supplies, however, mean that a longer response window is acceptable. With both plans developed, Illinois has the flexibility to use either plan, or even something in between, based on the emergency at hand.

    Tools
    Survey
    Download
    pdf, 147 KB

          C# Guru - we are looking for a colleague for this position. | Tasks: Developing a web based GPS tracking platf...   
    C# Guru - we are looking for a colleague for this position. | Tasks: Developing a web based GPS tracking platform with real time view. Creating Android and iOS apps for GPS Tracking and Internet of Things. As a developer you are involved in the whole development cycle. Working in small teams with a chance to influence the choice of tools and technologies. Besides software development you are involved in projects as a technical consultant. You are responsible for clean coding and work in a small group of very experienced developers. | What we offer: Workplace in the heart of the city in an office with 21st century solutions. An easy-going, non-bureaucratic environment. Membership in a young, enthusiastic group where your ideas really matter. Opportunity to travel abroad - one month of training in Sweden. Excellent salary and benefits. | Requirements: Degree in Computer Science or Engineering. Fluent English. Experience in developing high quality, complex software products. Excellent knowledge of C# and .NET. Experience with MySQL and MS SQL databases. Knowledge of JavaScript, HTML and CSS is a huge advantage. High motivation, flexibility and ability to work independently. Good planning/organizational skills and techniques. | More info and application here: www.profession.hu/allas/1040331
          Customer Reference Program Intern - VOC0002KH - we are looking for a colleague for this position. | Tasks: T...   
    Customer Reference Program Intern - VOC0002KH - we are looking for a colleague for this position. | Tasks: Taking part in the customer advocacy project launched by the marketing team. Contacting our external clients via e-mail in order to involve them in the project. Extending the reference customer database accurately with core contact details. Collecting the results of the net promoter score survey based on the partners' feedback. Helping the marketing department in other project-related and ad-hoc tasks. | What we offer: Flexible work schedule with 20-30 hours/week. Six-month internship position which helps you gain knowledge about the processes of a multinational company. Diverse tasks regarding process improvement where you can share and reveal your ideas. Chance to gain experience in different fields of marketing in a very proactive and creative team. Further career opportunities in the field of marketing and other departments. | Requirements: Have a full-time active university semester now. Are self-confident communicating in both Hungarian and English. Are prepared to stay in touch constantly with our external customers, mainly via e-mail. Are ready to assist commercial functions with reference customers for the project. Would like to get a better insight into marketing activities in a multicultural environment. | More info and application here: www.profession.hu/allas/1040603
          German Speaking Financial Operations Associate - Real Estate - 000000184763 - we are looking for a colleague for this position. | Tasks: ...   
    German Speaking Financial Operations Associate - Real Estate - 000000184763 - we are looking for a colleague for this position. | Tasks: Validate, post and book the landlord invoices in SAP • Create the new contracts in the SAP database and capture all contractual terms • Register, block and send Real Estate contracts to the Local Market for approval • Monitor contracts, renew and maintain contracts • Issue reports on status to the relevant stakeholders • Ensure resolution of external customer queries • Build strong relationships with external business partners. | What we offer: We provide work assets such as a laptop and a mobile phone with a Vodafone RED subscription • Unravel your continuous process improvement mindset - new ideas are always listened to • Internal coaching/mentoring culture • Support of career aspirations and personal development • Possibility to work from home one day weekly | Requirements: Fluent German and English language skills • MS Office skills; SAP experience is an advantage • Some experience gained in a similar position would be an advantage • Qualification in Finance is an advantage • Outstanding analytical and numerical skills • Good problem solving and interpersonal skills | More info and application here: www.profession.hu/allas/1040020
          Global data management analyst - we are looking for a colleague for this position. | Tasks: Periodic downl...   
    Global data management analyst - we are looking for a colleague for this position. | Tasks: Periodic download per region and analysis of interregional material flows • Analysis of customer requests / proposals for tool and equipment localization • Detailed analysis at part number and tool level • Lead all localisation activities • Definition of potential part numbers and tools, and decision making related to localization • Definition of possible receiving sites with the Global Capacity / Business Placement Team • Providing as much detail as possible to the receiving site and supporting project launch execution • Weekly status review with regional teams on new projects for decision making • Prepare results for executive PowerPoint presentation reviews • Analyse global OEE / TEEP numbers by plant and process • Prepare a database on utilisation of all processes for strategic decisions. | What we offer: Competitive salary • High-level technical background and professional improvement opportunity in an absolutely international environment | Requirements: Technician or Engineer • Min. 4 years of experience in the plastics industry or logistics • Excellent English skills, verbal and written • Advanced Excel capability • Strong team approach and communication capability • Very good analysis and problem-solving skills | More info and application here: www.profession.hu/allas/1035149
          Web developer / E-Business Application Lifecycle Management Support - we are looking for a colleague for this position. | Tasks: ...   
    Web developer / E-Business Application Lifecycle Management Support - we are looking for a colleague for this position. | Tasks: Ensure seamless functionality of our Magento eshop and our Typo3 website as first and second level support • Understand our deep integration of the eCommerce applications with our SAP and PIM-/DAM-system 'Contentserv' • Support the German organization to speed up development of new functions in PHP for Magento / Typo3 • Support UX modernization projects, like single sign-on activation / LDAP connection • V&B ticketing system 'Redmine' - administration and maintenance, smaller functional enhancements, skills optional • Test organization / user administration. | What we offer: competitive compensation package • working in the SSC (Shared Service Center) IT • long-term opportunity • pleasant and modern working environment • good working atmosphere | Requirements: A Bachelor's or Master's degree in Software Development or Business • Fluent English • Minimum 2 years of ecommerce and/or software development experience with the Magento eshop system and/or Typo3 content management is an advantage • Knowledge of the PHP programming language and MySQL databases • Flexible team player able to work effectively in environments requiring the ability to prioritize multiple concurrent projects | Additional requirements: Knowledge of SAP, especially Sales & Distribution, is an advantage | More info and application here: www.profession.hu/allas/1040758
          Risk Management Reporting Analyst 4-17042640 - we are looking for a colleague for this position. | Tasks: Job ...   
    Risk Management Reporting Analyst 4-17042640 - we are looking for a colleague for this position. | Tasks: Job Purpose: The Business Intelligence Analyst will be a key resource in managing development and support of new tools and reports: requirements gathering, developing rapid proofs of concept, leading stakeholder meetings, working with technology teams, creating specifications, documenting bug fixes and enhancement requests, managing user acceptance testing and responding to ad hoc requests from Senior Management. Key Responsibilities: • Participating in and providing support for significant ad-hoc projects and control initiatives • Work in conjunction with senior management on reporting key issues across a diverse set of risk and control metrics • Provide demos and walkthroughs of the ICG O&T QlikView dashboard to all levels of the firm • Provide ongoing and ad hoc reporting to senior stakeholders • Manage a monthly release cycle to update the ICG O&T QlikView Dashboard, including JIRA management • Develop and implement strategies for effectively managing risk • Knowledge of Citi's policies and procedures as they relate to risk management • Guide the lifecycle of operations/business controls solutions, with a focus on simplification, elimination, centralization and automation • Gather, write and review business requirements and translate requirements into functional specifications for technology partners • Write and review Operating Manuals and User Guides for the reporting tools offered by the team • Support data mapping between source systems, existing reports and the team's data warehouse • Analyze and understand the data models in the data warehouse and behind the team's reporting frontends (QlikView Dashboards, MicroStrategy) • Make recommendations with regard to improvements to program analytics to be used in managing Citi's risk management strategies. Development Value: • Risk Management activities in a global financial services organization • Knowledge of Citigroup processes and systems • Exposure to corporate strategic initiatives • Working as part of a global team • Hands-on experience with industry-leading Business Intelligence tools and architecture. | Requirements: Knowledge and Experience: • 5-7 years' experience in Business Intelligence, preferably in a controls or reporting role with a heavy focus on automation and reporting improvements • Understanding of Business Intelligence architectures • Experience with Business Intelligence or IT development projects • Customer-oriented, resourceful, flexible, quick-thinking and enthusiastic. Skills: • Understanding of SQL • Knowledge of developing reports, defining Universes and managing distribution using MicroStrategy or Business Objects • Experience with dashboard development, preferably in QlikView • Understanding of data warehousing concepts (ETL, SCD) • Understanding of relational databases • Ability to understand data requirements and convert them into logical data models • Ability to work with large quantities of data and convert it into understandable results. Additional advantage: • Knowledge of the control environment, including Audits, Issues & CAP management • Knowledge of ICG products and business processes • Good grasp of the control environment. Qualifications: • Higher degree / Bachelor's Degree • Nice to have: MBA, CPA, PMP, CSM or Lean Six Sigma certification a plus. Competencies: • Willingness to ask questions, challenge the process and seek out answers • Flexibility to handle multiple tasks and changing priorities • Excellent verbal, written and interpersonal communication skills • Strong team player with excellent project and analytic skills • Ability to interact with Senior Management, verbally and in written communications • Analytical ability, strong organizational skills and attention to detail • Highly self-motivated with a strong sense of initiative • Nice to have: able to identify efficient ways to address a variety of tasks and complete them in a time-efficient manner | More info and application here: www.profession.hu/allas/1040506
          CSIS EMEA GIA Business Intelligence and Reporting Analyst - 17044260 - we are looking for a colleague for this position. | Tasks: ...   
    CSIS EMEA GIA Business Intelligence and Reporting Analyst - 17044260 - we are looking for a colleague for this position. | Tasks: Job Purpose: The Business Intelligence Analyst will be a key resource in managing development and support of new tools and reports: requirements gathering, developing rapid proofs of concept, leading stakeholder meetings, working with technology teams, creating specifications, documenting bug fixes and enhancement requests, managing user acceptance testing and responding to ad hoc requests from Senior Management. Job Background: ICG O&T Control and Reporting is a strategic risk management organization that is integral to ensuring ICG Operations and Technology have the appropriate controls, oversight and reporting in place across all products, processes and systems to fully protect the firm. Within ICG O&T Control and Reporting, the Digital Controls Data Analytics team is responsible for creating reporting and dashboards that synthesize key reporting metrics. These include issue and corrective action plan (CAP) reporting, production access control, losses, end user computing controls and internal audit statistics. Key Responsibilities: • Participating in and providing support for significant ad-hoc projects and control initiatives • Work in conjunction with senior management on reporting key issues across a diverse set of risk and control metrics • Provide demos and walkthroughs of the ICG O&T QlikView dashboard to all levels of the firm • Provide ongoing and ad hoc reporting to senior stakeholders • Manage a monthly release cycle to update the ICG O&T QlikView Dashboard, including JIRA management • Develop and implement strategies for effectively managing risk • Knowledge of Citi's policies and procedures as they relate to risk management • Guide the lifecycle of operations/business controls solutions, with a focus on simplification, elimination, centralization and automation • Gather, write and review business requirements and translate requirements into functional specifications for technology partners • Write and review Operating Manuals and User Guides for the reporting tools offered by the team • Support data mapping between source systems, existing reports and the team's data warehouse • Analyze and understand the data models in the data warehouse and behind the team's reporting frontends (QlikView Dashboards, MicroStrategy) • Make recommendations with regard to improvements to program analytics to be used in managing Citi's risk management strategies. Development Value: • Risk Management activities in a global financial services organization • Knowledge of Citigroup processes and systems • Exposure to corporate strategic initiatives • Working as part of a global team • Hands-on experience with industry-leading Business Intelligence tools and architecture. | Requirements: Knowledge and Experience: • 5-7 years' experience in Business Intelligence, preferably in a controls or reporting role with a heavy focus on automation and reporting improvements • Understanding of Business Intelligence architectures • Experience with Business Intelligence or IT development projects • Customer-oriented, resourceful, flexible, quick-thinking and enthusiastic. Skills: • Understanding of SQL • Knowledge of developing reports, defining Universes and managing distribution using MicroStrategy or Business Objects • Experience with dashboard development, preferably in QlikView • Understanding of data warehousing concepts (ETL, SCD) • Understanding of relational databases • Ability to understand data requirements and convert them into logical data models • Ability to work with large quantities of data and convert it into understandable results. Additional advantage: • Knowledge of the control environment, including Audits, Issues & CAP management • Knowledge of ICG products and business processes • Good grasp of the control environment. Qualifications: • Higher degree / Bachelor's Degree • Nice to have: MBA, CPA, PMP, CSM or Lean Six Sigma certification a plus. Competencies: • Willingness to ask questions, challenge the process and seek out answers • Flexibility to handle multiple tasks and changing priorities • Excellent verbal, written and interpersonal communication skills • Strong team player with excellent project and analytic skills • Ability to interact with Senior Management, verbally and in written communications • Analytical ability, strong organizational skills and attention to detail • Highly self-motivated with a strong sense of initiative • Nice to have: able to identify efficient ways to address a variety of tasks and complete them in a time-efficient manner | More info and application here: www.profession.hu/allas/1040504
          ARIS System Specialist - 000000175278 - we are looking for a colleague for this position. | Tasks: Manage the ...   
    ARIS System Specialist - 000000175278 - we are looking for a colleague for this position. | Tasks: Manage the core process method, including attributes, symbols, naming, new modelling objects if requested, etc. • Monitor the Core Business Model by running and analysing semantic and structure checks to detect possible hiccups in the model structure • Manage the process filters, templates and user groups • Maintain clear boundaries between attributes for user groups, managing master symbols in the core and the settings of model types • Own and update the complete technical library of the CBM (Core Business Model) • Manage the CBM reporting packages by evaluating user requests, designing the reports and running them • Manage the integrity and coherence of the ARIS libraries (SAP transaction codes, positions, and document types) in close co-operation with the key Process Leads and the technical teams • Create extracts from ARIS to enable Knowledge Portal interface efficiency • Think about possible solutions for automation • Give regular updates on ARIS progress to the Process Governance Team • Manage EVO Knowledge Portal updates as backup support. | What we offer: We provide working assets such as a laptop and a mobile phone with a Vodafone RED subscription • Unravel your continuous process improvement mind-set - new ideas are always listened to • Internal coaching/mentoring culture • Internal career opportunities • Possibility to work from home | Requirements: Min. 3 years' experience is essential within a process modelling environment where a clear understanding of business models, quality standards, business process and design can be demonstrated • ARIS Certified Business Process Administrator • ARIS experience or BPM tool database management or equivalent • Excellent English communication skills, written and verbal • Accurate, focused work with attention to detail • Ability to build effective working relationships with an international and geographically disparate team • Ability to manage in a dynamic, high-growth and uncertain environment • Capable of lateral thinking | More info and application here: www.profession.hu/allas/1041140
          Reference Data Analyst Intern (part time) - Budapest - we are looking for a colleague for this position. | Tasks...   
    Reference Data Analyst Intern (part time) - Budapest - we are looking for a colleague for this position. | Tasks: Research and acquire in-depth knowledge of the market data content models of upstream data suppliers to MSCI for multiple financial asset classes. Learn and apply the business logic required to normalize sets of terms and conditions data models. Work closely with Senior Terms and Conditions Analysts in the implementation and maintenance of these business rules. Monitor quality of terms and conditions data on a daily basis. Understand the use of MSCI Risk Management Software and the workflow of Terms and Conditions Data. Understand new reference data requirements and work with Business and Development Groups. | Requirements: Pursuing a university program with a focus on Finance, Economics, Engineering, Mathematics, Statistics or another quantitative field with strong knowledge of finance. Demonstrated interest in financial markets, products and risk management. Interest in market data content management - specifically in any or all of equities, equity derivatives, domestic and international mutual funds, fixed-income govt./corporate debt, structured debt, commodity futures, commodity derivatives and credit derivatives. Interest in working in an environment that combines finance and technology. Programming experience in Visual Basic, Perl and C/C++ is a plus. Knowledge of SQL and relational databases, preferably Oracle, is a plus. Must be willing to work in a global team environment and able to move things forward via strong communication skills. Strong problem-solving skills, attention to detail. Possibility to work part time, 20-30 hours a week. | More info and application here: www.profession.hu/allas/1040675
          Multilingual HR Operations Associate - we are looking for a colleague for this position. | Tasks: Administrati...   
    Multilingual HR Operations Associate - we are looking for a colleague for this position. | Tasks: Administration (90%): • Process HR transactions in HR systems • Maintain the HR Oracle database, and support audits to ensure the highest standards of completeness, accuracy and compliance with relevant procedures • Consistently follow defined HR processes, challenging steps that may not make sense / add value • Process documentation in accordance with possible changes, together with process and system experts • Ensure that high standards of accuracy and quality are maintained with appropriate controls in place • HR and regional Global Operations HR Partnership (10%): • Develop a good working relationship with the GE HR community through timely and accurate administration of HR processes, ensuring effective communication and early identification of requirements and any service issues • Proactively phone regional HR partners and customers to ensure timely resolution of transactions and manage expectations • Be open to partner feedback and use it to improve our service offering • Deliver on commitments, manage expectations & keep partners informed on progress, taking clear accountability and ownership throughout the process. | Requirements: Administrative experience in a preferably similar environment • Ability to prioritize multiple tasks and work to deadlines • Comfortable delivering against quantitative and qualitative performance metrics • Excellent attention to detail • Proven ability to work professionally and proactively with a remote client base • Fluency in English and one other European language (German/French/Spanish/Dutch/Swedish/Norwegian/Italian/Polish/Portuguese) • Ability to anticipate and resolve challenges • Working experience with Microsoft Office • Ability to manage sensitive data • Supportive and enthusiastic team player • The successful applicant will be legally eligible to enter into an employment relationship under the laws of Hungary | Additional requirements: Shared Service experience (HR) • Success in a highly professional Customer Operations or HR Administration role, ideally within a multinational organization • Strong written knowledge of European languages other than English • Working experience with Oracle • SSCHU | More info and application here: www.profession.hu/allas/1039766
              arabidpanda on "Installing new wordpress theme remotely"   

    I apologize if this seems like a silly question. My boss wants to switch his wordpress theme (at my insistence and with my assistance). However, he wants me to do it. Cool, no problem - it's 2012, I can figure it out. I have never, however, run a wordpress site that wasn't a sitename.wordpress.com site.
    Here are my questions:
    1. Are all files installed via the web, or do I have to access the main server computer?

    2. Can we edit a theme from two different computers? Meaning, can I link his wordpress database up to my computer? Or is this unnecessary b/c it's all installed online?

    Basically, I am not 100% clear on how a wordpress site works. I know that in dreamweaver you can connect multiple computers, work offline, then upload to the site. But wordpress isn't my strong suit.

    HELP! Thanks so much for your time and sorry for the silly question.


          IT Supervisor Hungary & Romania - we are looking for a colleague for this position. | Tasks: Coordinate the da...   
    IT Supervisor Hungary & Romania - we are looking for a colleague for this position. | Tasks: Coordinate the daily activities of the local IT team within the areas of infrastructure and application support • Interface to customers/users • Work with local business management to fulfil requirements by successful delivery of IT services • Ensure compliance with standards, procedures and policies (security and further Shared Services and Sapa policies) • Participate in IT change activities/projects • Administer local infrastructure components • Resolution of client issues • Work with the global helpdesk to resolve infrastructure issues • Supervise service delivery • Supervise contracts by consulting with the Purchasing department • Management of local vendors. | What we offer: Excellent remuneration package (13th-month salary payment, supplementary benefits, Christmas benefits, company pension scheme membership and annual bonus) • Commitment to professional training and development, challenging and rewarding tasks, opportunities for career advancement in a rapidly growing multinational organization | Requirements: 5 years of work experience in a similar field of IT services / IT support, including application and infrastructure • BSc in Information Technology or equivalent • Detailed understanding of MS client & server technologies • Basic overview and experience in application/database development, requirements and process control • Fluency in English, verbal and written • Ability to work in an international team • Good interpersonal skills and experience in people management • ITIL knowledge is preferred • Experience in a manufacturing environment is preferred | More info and application here: www.profession.hu/allas/1040283
              How to Hack an Election in 7 Minutes - POLITICO Magazine   
    When Princeton professor Andrew Appel decided to hack into a voting machine, he didn’t try to mimic the Russian attackers who hacked into the Democratic National Committee's database last month. He didn’t write malicious code, or linger near a polling place where the  …
              Programmer Analyst - Montgomery County, OH - Ohio   
    The candidate should have experience with Oracle and/or SQL Server databases, HTML, .NET and/or ColdFusion and a working knowledge of SQL coding.... $45,000 - $99,000 a year
    From Montgomery County, OH - Sun, 16 Apr 2017 08:20:07 GMT - View all Ohio jobs
              By: Erik   
    2.1 does seem to crash less often in the Development module on my PC (which is greatly appreciated). It is still slow though, compared to all of the other apps on my machine.
    Lightroom version: 2.1 RC1 [508271 Beta 1]
    Operating system: Windows Vista Home Edition Service Pack 1 (Build 6001)
    Version: 6.0 [6001]
    Application architecture: x86
    System architecture: x86
    Physical processor count: 2
    Processor speed: 2.4 GHz
    Built-in memory: 3581.6 MB
    Real memory available to Lightroom: 1228.8 MB
    Real memory used by Lightroom: 374.1 MB (30.4%)
    Virtual memory used by Lightroom: 361.8 MB
    Memory cache size: 36 MB
    Serial Number: 116040099493498636733348
    Application folder: C:\Program Files\Adobe\Adobe Photoshop Lightroom 2
    Library Path: C:\Users\Administrator\Documents\Lightroom Backups\2008-08-13 2109\Lightroom Database-2.lrcat
              Staff Accountant - (Springfield)   
    Staff Accountant. Western New England University is seeking a full-time Staff Accountant for the Controller's Office whose primary purpose will be to maintain the position control database and provide general accounting support for the Controller's Office during the implementation of new campus-wide software programs. This position is funded by a temporary budget. Major responsibilities will include: maintenance of the position control database, preparation of budget adjustments, preparation of budget advisory meeting documents and various finance committee meeting notes, as well as maintenance of the Controller's Office webpage.
              Accounts Payable Analyst/Treasury - (Boston)   
    Title: Accounts Payable Analyst/Treasury
    Description:
    * Processing of accounts payable invoices to ensure that vendor payables are in accordance with contract terms and within predetermined performance measurements
      o Entry and distribution of invoices to the appropriate business owner for approval
      o Process approved invoices, prepare batches & submit to Supervisor
      o Provide reporting for vendor payable trends
      o Collaborate with General Ledger to ensure accurate account posting
    * Manage Associate Expense reports
      o Ensure auditing of reports for compliance with company guidelines
      o Coordinate with Associate and/or Department managers to resolve discrepancies
    * Vendor Administration
      o Maintain current vendor database according to company guidelines
      o Process and track daily vendor add/change requests
      o Conduct bi-monthly vendor due diligence screening
    * Support Treasury Analyst functions
      o Support Treasury Analyst in operational functions
      o Distribute daily cash reporting
      o Train and support weekly payable electronic fund transmission / check print
      o Manage and distribute petty cash, complete month-end reconciliation
      o Assist with month-end close procedures and other department responsibilities as required
    Requirements/Qualifications:
    * Associate's Degree in Accounting/Finance
    * 2-3 years of relevant Treasury/Banking experience (international preferred)
    * Previous experience in accounts payable and customer relations is preferred
    * Analytical skills with knowledge of spreadsheet applications
    * Highly organized, detail-oriented and good problem-solving skills
    * Self-motivated: able to work independently and as a team member
    * Strong interpersonal, written and verbal communication skills
    * Ability to react in a fast-paced and changing environment while not losing focus on priorities
    Source: http://www.juju.com/jad/000000009pbtot?partnerid=af0e5911314cbc501beebaca7889739d&exported=True&hosted_timestamp=0042a345f27ac5dc0413802e189be385daf54a16310431f6ff8f92f7af39df48
              Sunday Newspaper Coupon Inserts for (7/2/17)   

    This Sunday, July 2nd, you might find 1 coupon insert in the Sunday paper. Even though it's considered a holiday weekend, the P&G insert did not come out last weekend, so we should get it on Sunday.  So look for: Procter & Gamble. You can now search the Mission to Save Coupon Database to see what coupons will be included in each insert. Use the date and insert abbreviation (i.e., 3/9/14 RP) in the search box and choose “Inserts” from the source {Read More}

    The post Sunday Newspaper Coupon Inserts for (7/2/17) appeared first on Mission: to Save.


              Lake Elmer Thomas Rec Area (LETRA) (Military FamCamp) 5/10 - Lawton, Oklahoma   
    This park was full, and all the best sites with a view are taken by full-time people who live here. The restrooms and showers, both men's and women's, were in need of cleaning. We came to take the grandkids swimming at the beach across the street. There were a couple hundred kids there and almost no place to swim or toss down a towel, so I guess every soldier here brought the kids out to swim. If you are not military you now have to enter thru the visitors center on Sheridan Street and Highway 62, about 20 miles away, to get your visitors pass, which is good for up to a week I think. Then you have to drive your RV all the way thru the base at 20-35 miles per hour for about 20 miles. If you are passing thru this is not handy and takes lots of time, but if you are here for a military function on base it is great. Note: Most cell phones do not work here, and wifi is iffy at best. The little store only has snacks, so be sure to buy food before you come. This is a military base: no guns, ammo, or knives with a blade more than 3 inches. Your concealed carry permit is not valid on this base, nor is your gun. You can register a gun to bring on base, but it goes into a national federal database, if that matters to you.
              Buy Quality COUNTERFEIT MONEY And fake Passports,Driver’s License,ID Cards,Visas.SSN (anthonymarc540@gmail.com )   

    Buy Quality COUNTERFEIT MONEY And fake Passports,Driver’s License,ID Cards,Visas.SSN (anthonymarc540@gmail.com )

    Hello We are the best producers of HIGH QUALITY COUNTERFEIT Banknotes, Getting a fake and a real (genuine) passport, ID or driving
    license or any other document is simple. We can make you both real and
    fake documents.
    However, the real documents are more expensive than the fake because
    it takes time, skill and contacts to get it done. Note that, the fake
    is going to be 100% unique and in very good quality. The difference is
    based on the registration of the numbers. The real Document will be
    registered with the country's database so you can use it to travel to
    any country of your choice or in the country, meanwhile the fake will
    not be registered but can be used as well.

    Contact e-mails.......... anthonymarc540@gmail.com
    General support.......... anthonymarc540@gmail.com

    fake USA(United States) passports,
    fake Australian passports,
    fake Belgium passports,
    fake Brazilian(Brazil) passports,
    fake Canadian(Canada) passports,
    fake Finnish(Finland) passports,
    fake French(France) passports,
    fake German(Germany) passports,
    fake Dutch(Netherland/Holland) passports,
    fake Israel passports,
    fake UK(United Kingdom) passports,
    fake Spanish(Spain) passports,
    fake Mexican(Mexico) passports,
    buy fake South African passports.
    buy fake Australian driver licenses,
    buy fake Canadian driver licenses,
    buy fake French(France) driver licenses,
    buy fake Dutch(Netherland/Holland) driving licenses,
    buy fake German(Germany) driving licenses,
    buy fake UK(United Kingdom) driving licenses,
    buy fake Diplomatic passports,
    buy fake USA(United States) passports,
    buy fake Australian passports,
    buy fake Belgium passports,
    buy fake Brazilian(Brazil) passports,
    buy fake Canadian(Canada) passports,
    buy fake Finnish(Finland) passports,
    buy fake French(France) passports,
    buy fake German(Germany) passports,
    buy fake Dutch(Netherland/Holland) passports,
    buy fake Israel passports,
    buy fake UK(United Kingdom) passports,
    buy fake UK(United Kingdom) driving licenses,
    buy fake Diplomatic passports,
    buy Camouflage passports,
    buy passport Duplicates,
    fake USA(united States) passports for sale,
    fake Australian passports for sale,
    fake Belgium passports for sale,
    fake Brazilian(Brazil) passports for sale,
    fake Canadian(Canada) passports for sale,
    fake Finnish(Finland) passports for sale,
    fake French(France) passports for sale,
    fake German(Germany) passports for sale,
    fake Dutch(Netherland/Holland) passports for sale,
    fake Israeli passports

    buy, get, fake, false, passport, passport, id, card, cards, uk, sell, online, canadian, british, sale, novelty, conterfeit, bogus, american, united, states, usa, us, italian, malaysian, australian, documents, idetity, idetification, driver, license, licence, driving, residence, permit, SSN fake passport id, free fake passport, identity theft, fake, novelty, camoflauge, passport, anonymous, private, safe, travel, anti terrorism, international, offshore, banking, id, driver, drivers, license, instant, online, for sale, cheap, wholesale, new identity, second, citizenship, identity, identification, documents, diplomatic, nationality, how to, where to, get, obtain, buy, purchase, make, build, a, passport, i.d., british, honduras, uk, usa, us, u.s., canada, canadian, foreign, visa, swiss, card, ids, document, getting, visas, cards, foriegn .

    ((MONEY))

    These bills are not homemade but industrially and professionally manufactured by high-quality
    IT technicians from the US, Russia, Korea and China. We offer high quality COUNTERFEIT NOTES for the following currencies:

    EUR - Euro
    USD - US Dollar
    DNR - DINAR
    GBP - British Pound
    INR - Indian Rupee
    AUD - Australian Dollar
    CAD - Canadian Dollar
    AED - Emirati Dirham
    ZAR - Rand
    CHF - Swiss Franc
    CNY - Chinese Yuan Renminbi
    MYR - Malaysian Ringgit
    THB - Thai Baht
    NZD - New Zealand Dollar
    SAR - Saudi Arabian Riyal
    QAR - Qatari Riya

    Email-: anthonymarc540@gmail.com

    Text/call: +19162377298

    E-Mail Your Questions and Comments.

    We are looking forward to receiving your inquiries and early receipt of your first orders!

    SERIOUS INQUIRIES ONLY PLEASE


              Re: Trump's voter commission already stirring criticism   
    I have a premise about voting in elections. The Constitution begins with "WE the PEOPLE..." I see that as the first cornerstone of the Constitution; as my first civil right and, yes, duty. Anyone who causes a vote to be cast, for any office, for any person, for any party, for any referendum, for any measure, when that person does not meet the requirements to vote in that election, precinct, county (parish), or state (commonwealth) has feloniously diminished or increased the power of every legal voter's vote. Thus an illegal vote is a violation of the civil rights of every citizen with a legal right to vote, whether the citizen decides to vote or not. I do not want anyone, not a single person, who is eligible to legally vote in our country to be kept from enjoying that basic Constitutional civil right, once per election.

    It is "WE the PEOPLE..." not "I the people..." not "me the people..." not "us people who know better than the rest of the people..." An illegal vote is an attempt to subvert the will of the people in selecting their representatives. I don't care if the illegal vote cast is the one vote which would seat the candidate of my choice - I want that illegal voter to go to prison for an attempt to overthrow our basic governmental principle. It would be, "Too bad, Granny, but I'll visit you whenever I'm near the federal prison!" With that said...

    My lifelong profession caused me to move around the country on many occasions. I always registered to vote and voted wherever I became a new resident. It never occurred to me until the 2012 presidential election that my registrations going back to my first one in 1964 were never cancelled by me when I moved out of that jurisdiction.

    I wrote to each of the fourteen registrars with whom I had registered and asked for my voter participation history. Eleven replied. The other three were the earliest and may not have any records still on file, but, gee, it would have been nice to hear something from them.

    The eleven replies were interesting. In five instances complete voter participation histories were provided and corresponded with my actual voter participation history. Six replies provided more information than I expected.

    Here is what I really did not expect and hoped I would not find. My voter registration was used in six elections after I had moved out of those states, registered in another state and voted in another state in the same election. I guess I committed voter fraud six times or, more accurately, someone used my registration to commit voter fraud in six presidential elections!

    My father just complained but never registered to vote. My mother would not miss an election. As her executor and holder of her unrestricted Power of Attorney, I wrote to her last registrar for her voter participation record. I knew she was a diligent citizen, but I was dismayed to find she had risen from the dead to vote in the two presidential elections following her funeral.

    That is eight violations of the civil rights of every person who is legally eligible to vote in U.S. national elections and I am damn pissed about it. I only checked on two legal voters, my Mother and I, and found eight illegal votes. I am not the person anyone should try to convince that illegal voting is not a real problem. These eight instances don't even have a home where they can be tallied to create a national database. The FBI Agent I talked with about this just told me straight out - "This would be a lot of paperwork and would not be filed under anything but the legal voter's names. The suspects walked in where there were no cameras, pushed buttons or handled ballots that several hundred other people have probably handled. There is no chain of evidence." I took that to mean that my civil rights might have just as well been violated by a tornado or some other Act of God, except I know this was done by the act of people bent on committing the perfect crime. And because it is a perfect crime, how many times have each one of them done it and how many others have done it? We just don't know!

    Running for the President of the local high school's senior class is better regulated.

    Each of the Registrars provided me with instructions on how to cancel a voter registration in their jurisdictions in the future and assured me that my registrations in each of the jurisdictions had been nullified before my initial letter to them after they had followed their "normal procedures" to keep the registrations accurate.

    None of them replied when I wrote back to thank them, suggest they develop "normal procedures" which would work a little faster and ask who I should see to get those illegal votes purged from the vote tally.
    Posted by Michel Starker
              Systems Analyst (Teradata Database Developer) - McCain Foods (Canada) - Florenceville, NB   
    The world’s largest manufacturer of frozen potato specialties, McCain also produces frozen fruit and vegetables, appetizers, oven meals, juice, pizza and...
    From McCain - Tue, 04 Apr 2017 12:46:27 GMT - View all Florenceville, NB jobs
              Database .NET 22.2.6392.5 Portable   

    Database .NET is a simple program for managing various DBMSs. You can create, insert, select, update, delete, export (CSV, XML, TXT), print a data table, and use the SQL console. Supports Access, Excel, DB, Firebird, MySQL, SQL Server, SQL Azure, SQLCE, SQLite, PostgreSQL, Oracle, DB2, OLEDB, ODBC and OData. No installation required.
              IT Support Technician - Smurfit Kappa - Saskatchewan   
    Proactively work on resolution of logged IT tickets via the IT Helpdesk Database. At Smurfit Kappa, we open up opportunities with forward-thinking customers by...
    From Smurfit Kappa - Wed, 19 Apr 2017 17:54:59 GMT - View all Saskatchewan jobs
              The CBCP’s fake idea that the poor are “blessed” is the reason the Philippines remains poor   
    To Filipinos, rich people are “evil” and a life focused on the acquisition of money is a life to be ashamed of. And so there really is no reason to continue wondering why Filipinos remain an impoverished people today. The very idea of aspiring to be rich seems, to Filipinos, to be a thought to be purged from their minds.
    Rather, Filipinos justify their poverty using the very Catholic idea that to be poor is to be “blessed” before the eyes of God. Indeed, if I were to choose one fake idea ingrained by the Roman Catholic Church in the mind of the Filipino that did the biggest damage to their psyche, it would be that one. Unfortunately for Filipinos, such wealth-destroying ideas continue to be peddled by their Roman Catholic Church. Filipinos are subject to a constant bombardment of poverty-blessedness drivel everywhere they tune their eardrums and plant their eyeballs — in the Catholic masses they troop to every Sunday, the telenovelas and Filipino films they are addicted to, and the victim-heroes their “activists” and politicians put up for worship.
    The irony that seems to fly above the heads of Filipinos’ poverty-worshipping and wealth-demonising “activists” and politicians is that it is the highly-focused pursuit of capital accumulation (a.k.a. wealth creation) that enabled human civilisation to build the very devices and Web services that allow them to Tweet and Share their poverty porn. Contrary to popular belief, Silicon Valley’s titans are no latte-sipping liberals whose idea of “making a difference” in the world is sitting in a Starbucks café waxing poetic about “world peace” and sending relief goods to war refugees. Bill Gates wanted to dominate our desktop PCs with his software, Steve Jobs wanted to make computers that appeal to affluent artsy people, Zuck wanted to pick up chicks on the Internet, Jack Dorsey wanted to build a Facebook-killer people could use from their mobile phones, and Sergey Brin sought to organise humanity’s collective knowledge into a giant database.
    In case I missed some kind of Catholic pastoral letter on the subject, I really can’t see what is so “evil” about what these five brilliant — and mega-rich — human beings did.
    Indeed, as much as Filipinos would like to attribute everything that is wrong with their society to the “evil” devices of 16th-Century Imperial Spain, it was Spain’s pursuit of gold that brought their ships to the beaches of Cebu and the building of the city of Vigan (among other architectural wonders) that Filipinos now put up as the “pride” of “their” tourism industry. Perhaps, in some fairness to the Catholic Church, there is some evidence that the Spanish conquistadores found the natives of the islands in pretty much the same state as the way Filipinos are even today in 21st Century Philippines.
    In his Inquirer column, Ambeth Ocampo writes how painfully-relevant the observations made by the Spaniards on the natives they found in the Philippines are to this day…
    When I was a student, everything bad in our character was blamed on the colonial experience: on Spain, the United States and Japan. Reading Legazpi made me wonder if we had always been the way we are:
    “These people declare war among themselves at the slightest provocation, or with none whatever. All those who have not made a treaty of peace with them, or drawn blood with them, are considered as enemies. Privateering and robbery have a natural attraction for them. Whenever the occasion presents itself, they rob one another, even if they be neighbors or relatives; and when they see and meet one another in the open fields at nightfall, they rob and seize one another. Many times it happens that half of a community is at peace with half of a neighboring community, while the other halves are at war. They assault and seize one another; nor do they have any order or arrangement in anything. All their skill is employed in setting ambuscades and laying snares to seize and capture one another, and they always try to attack with safety and advantage to themselves.”
    To some degree this revelation that Filipinos already possessed the Poor DNA before the “evil” Spaniards arrived absolves the Roman Catholic Church of some accountability for why Filipinos are imprisoned by impoverished thinking today. But armed with all this hindsight that guys like Ocampo are kind enough to share with us today, there really is no excusing the Catholic Church and its henchmen in the Catholic Bishops Conference of the Philippines (CBCP) continuing to propagate its brain-damaging fake ideas today.
    There is much to be done. As the old cliché goes, you gotta think rich to become rich. Filipinos need to purge their culture of memetic relics that contribute to impoverished thinking — that money and rich people are “evil”, that wealth is more a result of swerte (“luck”) than of hard work and clever ideas, that complex ideas articulated in English cause “nosebleed”, and, of course, the old Catholic notion that the poor are “blessed”. There is nothing “blessed” about being poor. Being poor sucks. As Mae West was said to have said: “I’ve been rich and I’ve been poor, and rich is better.” To be fair, Filipinos who, as a people, have never been rich probably wouldn’t get it. Yet.

    About benign0

    benign0 is the Webmaster of GetRealPhilippines.com.

    http://www.getrealphilippines.com/blog/2017/06/the-cbcps-fake-idea-that-the-poor-are-blessed-is-the-reason-the-philippines-remains-poor/

              Product Database Specialist - Teknion Limited - Toronto, ON   
    Baan, Operations, PPG etc. Teknion creates furniture that connects people, technology and spaces....
    From Teknion Limited - Fri, 23 Jun 2017 23:43:48 GMT - View all Toronto, ON jobs
              Security and Compliance Assistant - PSA Airlines, Inc. - Vandalia, OH   
    Maintain and assist in auditing the company databases for airline identification media, United States. Let your career take off with PSA Airlines....
    From PSA Airlines, Inc. - Wed, 21 Jun 2017 21:04:04 GMT - View all Vandalia, OH jobs
              White House Panel Asks States For Their Voter Rolls   
    A letter from Kris Kobach, the vice chairman of a White House commission looking into voter fraud and other irregularities, is drawing fire from some state election officials. The letter, sent Wednesday to all 50 states, requests that all publicly available voter roll data be sent to the White House by July 14, five days before the panel's first meeting. The information requested includes the names, addresses, birthdates, political party (if recorded), last four digits of the voter's Social Security Number and which elections the voter has participated in since 2006, for every registered voter in the country. Kobach, who is also Kansas' Republican secretary of state, did not say how the commission plans to use the data other than to help it "fully analyze vulnerabilities and issues related to voter registration and voting." However, Kobach has long advocated comparing state voter rolls with other government databases to identify noncitizens or other illegitimate registrants. Voter
              Coupon Insert Preview | Week of July 2, 2017   
    There is 1 confirmed Coupon Insert in the paper this weekend (July 2, 2017)… a Procter & Gamble insert. You can check it out on my Coupon Insert Preview page! Looking for the coupons that came out in recent weeks? Be sure to check out my coupon database for all the current non-expired insert and printable coupons!...

    Read More »

     

              Coupon Insert Preview | Week of June 25, 2017   
    There are 2 confirmed Coupon Inserts in the paper this weekend (June 25, 2017)… a SmartSource and a Redplum insert. You can check it out on my Coupon Insert Preview page! Looking for the coupons that came out in recent weeks? Be sure to check out my coupon database for all the current non-expired insert and printable...

    Read More »

     

              Coupon Insert Preview | Week of June 11, 2017   
    There are 2 confirmed Coupon Inserts in the paper this weekend (June 11, 2017)… a SmartSource and a Redplum insert. You can check it out on my Coupon Insert Preview page! Looking for the coupons that came out in recent weeks? Be sure to check out my coupon database for all the current non-expired insert and printable...

    Read More »

     

              Coupon Insert Preview | Week of June 4, 2017   
    There are 3 confirmed Coupon Inserts in the paper this weekend (June 4, 2017)… a SmartSource and 2 Redplum inserts. You can check it out on my Coupon Insert Preview page! Looking for the coupons that came out in recent weeks? Be sure to check out my coupon database for all the current non-expired insert and printable...

    Read More »

     

              Database Technology Manager   

              Apple iPhone 4S Contract Deals Compared   

    After months of speculation about an iPhone 5, a premium and a budget version and other wild rumours, the truth surrounding the latest iPhone was unveiled on 4th October.

    The new iPhone 4S is more of an upgrade to the existing iPhone 4, and on the outside looks identical, with the same casing. The new smartphone does, however, come with a number of upgrades under the hood, most notably a dual-core processor and an 8-megapixel camera. There is also Siri, the new voice-activated assistant, and it ships with the new iOS 5 platform.

    There is not much we can really add to the plethora of reviews and details about the iPhone 4S already available online, but what we can do is bring you the latest iPhone 4S prices and deals from the leading UK retailers, put them in a sortable database, and let you find the best contract to suit your budget and usage.

    We have found SIM-free, unlocked deals starting at £520, or you can get the handset free on a 24-month contract paying £40.85 a month. There are a few hundred deals on each configuration (black, white, 16GB, 32GB and 64GB models) with all the major networks and retailers.

    View all iPhone Deals...

     


              Learn Microsoft SQL & DataBase concepts from scratch   

              Rash 2.0 Beta   

    I've had most of this code written and then kind of slacked off for a long time. The beta release of Rash 2.0 is for bugfixing and such and will not work with your current databases.

    Keep in mind the real release of 2.0 will change your databases to work properly, and will set itself up with no problems, this release is just to give out the code that I've written.

    All of the requested features except for auto-approve have already been implemented in the beta release; the only problem is with localization -- I only have US English, haha.

    Read the readme file in the beta release when it comes out -- should be in an hour or so.


              Senior People Analyst - Tesla Motors - Fremont, CA   
    Partner with the HRIS team to maintain HR databases, audit data, prepare reports, propose business process improvements and automation....
    From Tesla Motors - Wed, 10 May 2017 21:18:12 GMT - View all Fremont, CA jobs
              Systems Analyst (Teradata Database Developer) - McCain Foods (Canada) - Florenceville, NB   
    McCain Foods is seeking a Systems Analyst, specialized in Teradata database development, to contribute to the success of our Enterprise Data Warehouse (EDW)...
    From McCain - Tue, 04 Apr 2017 12:46:27 GMT - View all Florenceville, NB jobs
          Tweets for Saturday, July 1 (Part 1)   

    ☆KESHA☆ Kesha Rose Sebert (born March 1, 1987) is a vegetarian and an animal rights activist (~▽~@)♪♪ ja.m.wikipedia.org/wiki/%E3%82%B1…

    — james.k (@junn0817) July 1, 2017 - 00:27

    Every heartless comment makes me sad. But this is the reality! Even if I am derided as a self-satisfied hypocrite, I will not stop my awareness campaign against fur! Animals' lives have dignity too! Needless sacrifice of life must be avoided! s.ameblo.jp/amistad-jun007…

    — james.k (@junn0817) July 1, 2017 - 00:57

    "@BatouCode2501: In Japan, the only animal-related job that requires a qualification is veterinarian.

    — james.k (@junn0817) July 1, 2017 - 01:27

    Breeders, trainers, and pet shops need no national qualification at all." Legal regulation is essential

    — james.k (@junn0817) July 1, 2017 - 01:57

    Love the life you live. Live the life you love. Bob Marley (Jamaican reggae musician / 1945-1981)

    — james.k (@junn0817) July 1, 2017 - 04:27

    A tweet from February 06, 2014.

    — james.k (@junn0817) July 1, 2017 - 06:27

    Please learn the truth about fur! Repost: "The Truth About Rabbit Fur (the graphic rabbit photos were deleted, so images of other rabbit victims are attached)" ameblo.jp/yayamin/entry-… pic.twitter.com/9BFzziJoia

    — james.k (@junn0817) July 1, 2017 - 06:57

    Reposting a report! "60 Animals Abandoned: To Everyone Who Gave Their Support" amba.to/1opbFSs

    — james.k (@junn0817) July 1, 2017 - 07:27

    By that logic, not eating sunflowers or pansies while being fine with cabbage and lettuce would count as "discrimination" too, wouldn't it?" ← You know the answer yourself

    — james.k (@junn0817) July 1, 2017 - 07:57

    Act boldly. I will take the responsibility. - Saigō Takamori -

    — james.k (@junn0817) July 1, 2017 - 08:27

    Ticks have become active again. Please be careful. from4050.jp/wp/%E5%8C%BB%E…

    — james.k (@junn0817) July 1, 2017 - 08:57

    Please share - no graphic images. One thing common to the fur of every animal: it is stripped while the animal is still alive. "@junn0817: Fur products are flooding in everywhere! Even the 100-yen shops carry them! Behind the cuteness lies the reality of animal abuse and slaughter. pic.twitter.com/TwYpVKADh1"

    — james.k (@junn0817) July 1, 2017 - 09:27

    An estimated 18% of global greenhouse gas emissions are said to be related to livestock farming, with beef production the single largest source. afpbb.com/articles/-/257…

    — james.k (@junn0817) July 1, 2017 - 09:57

    The dangers of eating meat!! emerald-company.com/database_dange…

    — james.k (@junn0817) July 1, 2017 - 10:27

    Why I don't eat meat. Repost ◆ I went to observe a slaughterhouse. blog.goo.ne.jp/grandemperor/e… @vegetarian_kei

    — james.k (@junn0817) July 1, 2017 - 10:57

    Why I don't eat meat. Repost! "Slaughterhouse Report: Showing Everything. Kakogawa Meat Center" amour918.blog.fc2.com/blog-entry-803…

    — james.k (@junn0817) July 1, 2017 - 11:27

    Fukuzawa Yukichi on "freedom and independence": freedom exists within constraint! It is precisely by diligently doing what must be done that freedom can be enjoyed! Freedom does not exist where there is nothing one ought to do.

    — james.k (@junn0817) July 1, 2017 - 11:57

    President Geun-Hye Park:: Take Dog & Cat Meat off the Menu! change.org/en-GB/petition… via @ChangeAUS

    — james.k (@junn0817) July 1, 2017 - 12:27

    Dog Meat Trade: youtu.be/ynq4M2LZ0sY

    — james.k (@junn0817) July 1, 2017 - 12:57

    Stopping the killing of cats and dogs in China and any other Asian Country. - PetitionBuzz petitionbuzz.com/petitions/agai…

    — james.k (@junn0817) July 1, 2017 - 13:27

    Stop euthanasia of strays in Romania change.org/petitions/stop… via @Change

    — james.k (@junn0817) July 1, 2017 - 13:57

    (twitter.com/junn0817/statu…)

    — james.k (@junn0817) July 1, 2017 - 14:27

    The Truth About Rabbit Fur | We Don't Need Fur (Real Fur) no-fur.org/about/usagi/de…

    — james.k (@junn0817) July 1, 2017 - 14:57

    If your diet is heavy on meat, please do take a look: h2.dion.ne.jp/~apo.2012/daiy…

    — james.k (@junn0817) July 1, 2017 - 15:27

    Abused for 50 years, a freed elephant sheds "tears" in India cnn.co.jp/fringe/3505076… via @cnn_co_jp

    — james.k (@junn0817) July 1, 2017 - 15:57

    "Whether or not we can protect small lives is, I believe, a barometer of a society's maturity. Through the foundation's activities I want to contribute, even a little, to raising our society's maturity." (Christel Takigawa)

    — james.k (@junn0817) July 1, 2017 - 16:27

    Repost! Christel Takigawa, "The Elephant That Sheds Tears" amba.to/1sowcg8

    — james.k (@junn0817) July 1, 2017 - 16:57

    Warning! McDonald's meat is dangerous! yomiuri.co.jp/world/20140723…

    — james.k (@junn0817) July 1, 2017 - 17:27

    I'm going to become a cat blogger, meow! 13 cats taking adorably cute selfies curazy.com/archives/989

    — james.k (@junn0817) July 1, 2017 - 17:57

    [Storage experts] 20 cats that fit neatly into boxes curazy.com/archives/16871

    — james.k (@junn0817) July 1, 2017 - 18:27
              mysqlcheck : Check, Repair, Optimize MySQL Tables   

    mysqlcheck is a command-line utility to check, repair, and optimize MySQL tables. Using it demands some knowledge of database handling, which we have discussed.
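
    For instance, here are a few typical invocations (the database name mydatabase below is only a placeholder; run mysqlcheck --help for the full option list):

        # Check all tables in one database
        mysqlcheck --check -u root -p mydatabase

        # Repair all tables across every database
        mysqlcheck --repair --all-databases -u root -p

        # Optimize the tables of one database
        mysqlcheck --optimize -u root -p mydatabase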

    Read the full mysqlcheck : Check, Repair, Optimize MySQL Tables on The Customize Windows written by Abhishek Ghosh


              Comment on Automating Xojo Database Development with ARGen by Bob Keeney   
    Hm... since you already have the AR object m_oRecord, you, in reality, already have the property on your window. I see no point in mirroring it with a local property on the window. Instead of naming a property ms_Results, you already have m_oRecord.s_Results, and because of AutoComplete this will be easy to do on your own. Sorry, I don't think we'll be implementing this feature any time soon unless a lot of people start asking for it.
              Data Analyst - Sollers - Edison, NJ   
    Technical expertise regarding data models and database design development. Data analysts will develop analysis and reporting capabilities.... $75,000 a year
    From Indeed - Fri, 02 Jun 2017 19:43:07 GMT - View all Edison, NJ jobs
              QA Tester - Noviur Technologies - Vaughan, ON   
    Extracting data from databases to cross-reference against expected results within test scripts. Design, develop and maintain test plans and test cases utilizing...
    From Noviur Technologies - Thu, 08 Jun 2017 04:21:22 GMT - View all Vaughan, ON jobs
              Database Administrator (Tourism Industry)   
    South Africa - Our Client, a spectacular and high-quality tourism services company based in Century City, currently seeks a Systems/Contracting Database.... Minimum Requirements: 2 – 3 years’ experience in systems/database administration, preferably within the travel/tourism industry. Matric is essential. Key...
              We’d do it again in a heartbeat!   
    Launching a new website on a new platform—and integrating that platform with a new database and collaborative site—was a daunting prospect for our small nonprofit. Someone suggested that we seek help from Brett Wangman at TCAG, and we are forever grateful for that recommendation. Working with Brett has been a pleasure from start to finish. […]
              E: 31/07 Win A £200 Shearer Candles Voucher, Princes Square   
    *Princes Square* Entry into this competition shows your agreement to be added to the Princes Square customer database.
              (USA-OH-Cleveland) Reimbursement-Specialist   
    **Job Summary:** Monitors, reviews and applies correct coding principles to clinical information received from ambulatory areas for the purpose of reimbursement, research and compliance. Identifies and applies diagnosis codes, CPT codes and modifiers as appropriately supported by the medical record in accordance with federal regulations. Ensures that billing discrepancies are held and corrected.

    **Responsibilities:**
    + Compares and reconciles daily patient schedules/census/registration to billing and medical records documentation for accurate charge submission, which includes (but is not limited to) processing of professional charges, facility charges, and manual data entry.
    + Maintains records to be used for reconciliation and charge follow up.
    + Investigates and resolves charge errors.
    + Meets coding deadlines to expedite the billing process and to facilitate data availability for CCF providers to ensure appropriate continuity of care.
    + Responsible for working professional held claims in the CCF claims processing system.
    + Reviews, abstracts and processes services from surgical operative reports.
    + Reviews, communicates and processes physician attestation forms.
    + Communicates with physicians and other CCF departments (co-surgery) to resolve documentation discrepancies.
    + Assists with Evaluation and Management (E&M) audits and other reimbursement reviews.
    + Responsible for working E&M denials on the denial database.
    + Other duties as assigned.

    **Education:**
    + High school diploma or equivalent.
    + Specific training related to CPT procedural coding and ICD9 CM diagnostic coding through continuing education programs/seminars and/or community college.
    + Working knowledge of human anatomy and physiology, disease processes and demonstrated knowledge of medical terminology.

    **Certifications:**
    + Certified Professional Coder (CPC), Certified Coding Specialist Physician (CCS-P), Registered Health Information Technologist (RHIT), or Certified Coding Associate (CCA) by the American Health Information Management Association (AHIMA).

    **Complexity of Work:**
    + Requires critical thinking and analytical skills, decisive judgment and work with minimal supervision.
    + Requires excellent communication skills to be able to converse with the clinical staff.
    + Applicant must be able to work under pressure to meet imposed deadlines and take appropriate actions.

    **Work Experience:**
    + Minimum of 3 years coding, to include 1 year of complex coding experience in a health care environment and/or medical office setting, required.
    + Must demonstrate and maintain accuracy and proficiency in coding and claims editing to be considered for a Professional Coder III position.
    + Candidate must currently be employed as a Professional Coder II at the Cleveland Clinic or have met all the training, quality and productivity benchmarks of a Professional Coder II.

    **Physical Requirements:**
    + Typical physical demands involve prolonged sitting and/or traveling through various locations in the hospital and dexterity to accurately operate a data entry/PC keyboard.
    + Manual dexterity required to locate and lift medical charts.
    + Ability to work under stress and to meet imposed deadlines.

    **Personal Protective Equipment:**
    + Follows Standard Precautions using personal protective equipment as required for procedures.

    The policy of Cleveland Clinic and its system hospitals (Cleveland Clinic) is to provide equal opportunity to all of our employees and applicants for employment in our tobacco free and drug free environment. All offers of employment are followed by testing for controlled substance and nicotine. Job offers will be rescinded for candidates for employment who test positive for nicotine. Candidates for employment who are impacted by Cleveland Clinic’s Smoking Policy will be offered smoking cessation assistance and will be permitted to reapply for open positions after 90 days. Decisions concerning employment, transfers and promotions are made upon the basis of the best qualified candidate without regard to color, race, religion, national origin, age, sex, sexual orientation, marital status, ancestry, status as a disabled or Vietnam era veteran or any other characteristic protected by law. Information provided on this application may be shared with any Cleveland Clinic facility. Cleveland Clinic is pleased to be an equal employment employer: Women/Minorities/Veterans/Individuals with Disabilities
              (USA-OH-Akron) Analyst-Physician-Contracts   
    **Description:** Under the direct supervision of the Manager of Finance PPG, the Physician Contracts Analyst for the Partners Physician Group is responsible for the maintenance, tracking, and payments related to physician contracts.

    Job Duties:
    + Responsible for the organization and tracking of all Akron General (Partners Physician Group and Akron General Medical Center Clinic) physician contracts and MOUs between entities (CCF, AGMC, etc.), including the development of detailed checklists for each individual physician contract.
    + Completes contract review and validation to ensure that all appropriate information is set within the contract and that compensation/bonus structures are operationalized.
    + Responsible for calculation and payment of Physician and Nurse Practitioner bonuses on a quarterly, semi-annual, and annual basis.
    + Monitors physician performance against expectations as stated in the physician contract and creates periodic performance reports for review by the Manager of Finance.
    + Develops financial projections for physicians interested in joining Partners Physician Group (PPG).
    + Prepares payroll, within contract terms, for all physicians on a bi-weekly basis and prepares monthly journal entries to allocate payroll dollars to the correct entities.
    + Tracks physician timesheets and makes payments based on tracked hours, within contract terms, as necessary.
    + Participates in weekly CCAG Physician Contract Committee Meetings.

    **Qualifications:** Bachelor's degree in Business (Accounting, Finance or related). At least three years' experience in accounting or finance. Demonstrated experience in financial management in a physician practice management setting. Knowledge of contract analysis and implementation. Attention to detail is a must, along with strong written and verbal communication skills. Proficient personal computing skills including databases, spreadsheets and word processing. Ability to stay organized while working on multiple projects simultaneously and to meet project deadlines. Ability to facilitate positive physician relations and interact effectively with peers, staff and administration.

    The policy of Cleveland Clinic and its system hospitals (Cleveland Clinic) is to provide equal opportunity to all of our employees and applicants for employment in our tobacco free and drug free environment. All offers of employment are followed by testing for controlled substance and nicotine. Job offers will be rescinded for candidates for employment who test positive for nicotine. Candidates for employment who are impacted by Cleveland Clinic’s Smoking Policy will be offered smoking cessation assistance and will be permitted to reapply for open positions after 90 days. Decisions concerning employment, transfers and promotions are made upon the basis of the best qualified candidate without regard to color, race, religion, national origin, age, sex, sexual orientation, marital status, ancestry, status as a disabled or Vietnam era veteran or any other characteristic protected by law. Information provided on this application may be shared with any Cleveland Clinic facility. Cleveland Clinic is pleased to be an equal employment employer: Women/Minorities/Veterans/Individuals with Disabilities
              Oracle Database Administrator - Splice - Ontario   
    Support for QA. Create scripts to apply the changes needed to migrate to a new build. Develop and maintain the scripts and programs that control the DB build...
    From Splice - Mon, 19 Jun 2017 11:47:07 GMT - View all Ontario jobs
              Web Development Masterclass: 100s of Tutorials, 20 Unique Sections - only $17!   
    NOW ON: Web Development Masterclass: 100s of Tutorials, 20 Unique Sections - only $17!

    Expires: July 7, 2017, 11:59 pm EST


    Now you can learn everything there is about building and maintaining websites with this super info-packed Mighty Deals Exclusive: Web Development Masterclass! In 20 different sections, this online video tutorial covers everything from teaching you the latest scripting languages to walking you through the installation and administration of everything from LAMP stack to test servers. Best of all, you'll learn at your own pace with a lifetime to soak it all in.

    Highlights:

    • Learn everything you need to know to master Web Development.
    • Tackle the latest scripting languages:
      • HTML
      • CSS
      • JavaScript
      • jQuery
      • Bootstrap
      • PHP
      • XML
      • AJAX
    • Master setup and installations - domain name registration, nameserver and DNS Zone Files configurations, Ubuntu on a Virtual Machine, MAMP for Mac and more!
    • Create unique animations, validation algorithms, testing servers, disk backups and so much more.
    • Install and administer key components such as LAMP Stack, MySQL databases, Security Settings, a remote server using PuTTY and more.

    Testimonials:

    "I took this course to learn more about jQuery. They teach a lot of nice effects and animations. Very happy I took the training." - Gary Pope

    "This course is very well-rounded. It covers all the important concepts relevant to web development. I heard about this course through one of my co-workers. We are both very happy with the training." - Neal Matthews

    "I had a great time with this course. The project in the last section was exactly what I was looking for. If you are interested in learning more about AJAX development, take the course!" - Steve Cain

    Pricing:

    This invaluable course normally sells for $299, but for a limited time only, you can get this career-building Web Development Masterclass for a mere $17! That's a whopping 94% off the regular price.

    Click the BUY NOW button to start learning today!

    Deal terms:
    • After completing your purchase, you will be provided with a coupon code to register at the vendor's website, to gain lifetime access to the masterclass.
    • Includes full support from instructors.
    • The course is compatible with MAC and PC users. All required software downloads are free.
    • Some basic computer knowledge is required (creating files/folders, installing programs etc).
    • Lectures are recorded in HD and must be streamed - requires stable internet connection.
    • The course is licensed on a per user basis. It may not be sold, redistributed, or given away in any shape or form.

              (USA) Fiscal Follow Up Lead   
    Job Description

    **Summary**

    The Fiscal Follow-Up Lead supports the Follow-Up team with Fiscal findings cited in reports across all regions. This position is responsible for reviewing and analyzing each fiscal-related finding and developing recommended activities to ensure correction during follow-up reviews. The Fiscal Follow-Up Lead participates in conducting reviews (Follow-Up, Targeted, etc.) as assigned. The Fiscal Follow-Up Lead identifies potential system issues, trends and gaps as they review findings from all review types. The Fiscal Follow-Up Lead ensures sufficient data is collected to form the basis of performance assessment under the current monitoring process. The Fiscal Follow-Up Lead works collaboratively with Follow-Up Leads assigned to Regions, Regional Office staff, grantees and the Follow-Up Lead Manager, and provides subject matter expertise as needed.

    **Essential Duties**

    General:
    * Participates in all training regarding the HSPPS and develops a deep understanding of how the HSPPS align with the monitoring process
    * Understands and maintains a strong understanding of the HSPPS, HS Act and Uniform Guidance and how all integrate with the follow-up and current monitoring process
    * Participates in all required trainings and assists with the development and testing of protocols and technology
    * When requested, works with the Follow-Up Lead, Manager and CTO to develop a database that accurately tracks and reports all follow-ups and grantee findings using relevant technology. Participates in testing of data systems.
    * Conducts an analysis of the grantee’s review report results to identify key factors for the development of a customized Follow-up Recommendation Guide (FURG) (Fiscal findings specifically)
    * Assists in developing resources, training materials, etc. for internal staff and Regional Offices
    * Serves as a reviewer as needed for on-site targeted review events or on-site follow-ups
    * Participates in Regional Office trainings as requested to ensure follow-ups and targeted reviews are completed in a timely manner and RO expectations and needs are being met
    * Other responsibilities as assigned

    In collaboration with the Assigned Follow-Up Lead (Regional Office Collaboration (direct and indirect)):
    * Maintains knowledge of Fiscal findings, schedule, history, and areas for attention; and provides support and subject matter expertise to Follow-Up Leads and Manager as needed
    * Supports Follow-Up Leads in maintaining a current tracking/report of all fiscal findings within all regions; and provides regular updates to Follow-Up Leads for reports to the Regional Program Manager and Regional Monitoring Lead
    * Serves as reviewer when requested by DLH/Regional Office for targeted reviews
    * At the request and approval of the Follow-Up Lead, Manager, coordinates and conducts Follow-Up Reviews for Regions on behalf of Program Specialists
    * Assists in customizing follow-up reviews for individual regions, including working with the assigned Follow-Up Lead in scheduling and planning onsite follow-ups, working with identified onsite reviewers and Program Specialists to carry out onsite reviews as necessary, and ensuring any extensions for corrective action periods and changes to review dates are approved and adjustments are made in the system

    In collaboration with Follow-Up Leads and the Follow-Up Lead Manager (Internal DLH Head Start Collaboration):
    * Provides grantee information across reviews to communicate and coordinate with Managers and Leads on performance and fiscal findings
    * Provides evidence and narrative analysis of reviews which include fiscal findings and which are conducted by Regional Program Specialists, to ensure the client’s expectations are being met by DLH
    * Evaluates the sufficiency and appropriateness of grantee evidence; assists Follow-Up Leads in helping Program Specialists with identifying gaps
    * Conducts data analysis and final review of the Program Specialist’s narrative (Fiscal) to ensure clarity, accuracy and completeness
    * Coordinates and supports the Follow-Up Lead or other DLH staff in follow-up reviews or Targeted reviews with grantees to show areas of noncompliance and/or corrections and improvements within appropriate timelines

    Required Skills / Required Experience:
    Bachelor's in a related field with 6+ years of directly related work experience, or a compatible combination of education and experience. Ability to work effectively in a virtual environment. Ability to travel (30%) and act as reviewer on on-site follow-ups or targeted reviews.

    *DLH Corporation is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, pregnancy, sexual orientation, gender identity, national origin, age, protected veteran status, or disability status.*

    *Tracking Code:* 20264
    *Job Location:* Virtual/Remote, United States
    *Position Type:* Full-Time/Regular
    *Location:* Virtual/remote
              (USA-IL-Chicago) Senior Gifts Officer   
    Imagine the opportunity to bring your talent to a high-performing organization making a meaningful difference in the lives of men, women and children all across Chicago. We are one of the nation’s premier nonprofit organizations dedicated to feeding hungry people while striving to end hunger in our community. We are dynamic, mission-focused, innovative, values-driven, diverse, focused, impactful and energetic. We believe in inspiring change and know change only happens when everyone has a voice and a place at the table. It starts by unleashing the joy and potential in each other and the people we serve.

    The Senior Gifts Officer is a newly created role on the Individual Giving team. The Senior Gifts Officer is responsible for the overall management of the major gifts and planned giving program. Additionally, this individual will be charged with managing relationships and securing support from donors who are capable of making gifts of $100,000 and above. This individual will manage a personal portfolio of approximately 80-100 prospects, as well as manage 2 staff. The Senior Gifts Officer will work with the Director of Individual Giving to develop and implement a plan for expanded major gift/planned giving fundraising. The Senior Gifts Officer will also work closely with other members of the development and leadership team to promote the acquisition of major and planned gifts as well as significant leadership gifts. The candidate will be an individual with a rich development background, who has had progressive and successful development experience in major/planned gifts and capital campaign experience. The ideal candidate should have a strong track record of management experience and care deeply about mentoring gift officers.

    + Develop an annual plan that includes clear goals and objectives for growing major gifts as well as metrics to monitor success. Ability to understand the objectives of the individual giving program and to integrate the major gifts program with those objectives.
    + Supervisory responsibility for the Major Gifts Officer and Manager of Major Gifts.
    + Develop “top prospect” lists based on giving histories, research and anecdotal knowledge.
    + Create individualized cultivation strategies for a select number of donors each quarter based on guidance from senior management and overall fundraising strategies.
    + Work with the Director of Individual Giving and senior management to execute solicitations of potential major gift donors.
    + Oversee annual solicitation strategy and moves management process for the major gifts team and senior leadership.
    + Oversee creation of personalized proposals for donor solicitations.
    + Oversee appropriate recognition opportunities for major donors, which may include special events, targeted print and online publications, and/or physical sites for permanent acknowledgements.
    + Oversee creation of personalized stewardship reports for major donors.
    + Collaborate with other team members regarding stewardship.

    + Bachelor’s degree with 7+ years’ development experience
    + Experience managing staff required
    + Experience with capital campaigns and major gift solicitation
    + Planned Giving experience preferred
    + Demonstrated initiative and willingness to take on new projects
    + Superb interpersonal skills and the ability to actively listen
    + Excellent written and oral communication skills
    + Effective presentation and negotiation skills
    + Experience working with high-level donors, volunteers and colleagues on collaborative activities
    + Highly organized & detail oriented
    + Adaptable and flexible in a fast-paced environment, able to multi-task and handle several priorities simultaneously
    + Team-oriented focus with the ability to work independently
    + Proficiency in MS Office – Word, Excel, PowerPoint, Outlook
    + Development database experience (Raisers Edge experience desirable)

    ID: 2017-1400
    External Company Name: Greater Chicago Food Depository
    External Company URL: gcfd.org
              VTech’s ‘Learning Lodge’ hack exposes kids’ data   
    In a statement today, VTech, maker of gadgets for children, announced that its customer data has been breached. According to the company, an “unauthorized party” accessed data within the Learning Lodge app store database earlier this month; that database has neither banking info nor certain personal identification details like social security numbers, but does have ‘general’ profile info. VTech has … Continue reading
              Dedication of Sandusky’s City Building in 1958   

    Local photographer Bob Frank took several photographs on the occasion of the dedication of Sandusky’s City Building on Meigs Street on  May 4, 1958.  Master of Ceremonies for the Dedication was John LaFene. A mobile studio from WLEC Radio was on hand to broadcast the event. Several local policemen stood at attention in front of the building.


    Representatives of Amvets Post 17 presented an American flag to ex-officio Mayor Stuart Gosser, and the Commodore Denig Post, American Legion presented the Ohio flag. Longtime state legislator Ethel Swanbeck can be seen seated on the stand.


    The Sandusky High School Band accompanied Pamela Mielke as she sang the Star Spangled Banner.


    After the flag raising ceremony, over one hundred city employees conducted guided tours of the newly dedicated City Building.

    Visit Sandusky Library's online database to view several more pictures from the dedication of the City Building in 1958.
              Consulting Database Specialist - OCLC - Dublin, OH   
    You have a life. We like that about you. At OCLC, we believe you'll do the best work of your life when you're living the best life possible. We work hard to...
    From OCLC - Tue, 30 May 2017 19:09:34 GMT - View all Dublin, OH jobs
              MAXIMUM ROCKNROLL PRESENTS: "Punk From The Middle East And Asia"   

    Last year Maximum Rocknroll had a fundraiser of sorts for their record archive. It outgrew "collection" status many years ago, and has evolved into one of the most comprehensive archives of punk vinyl in the world, well over 50,000 records (and growing). The funds were raised to help create a database (that will include original reviews, and will be free and searchable) and curate the existing archive - it was (is) a shitload of work, and shitloads ain't free, you know? Various incentives were offered to encourage donations, including mix tapes....giver's choice. Some enterprising (and brilliant) fukkr asked for the following tape: "one side Asia (non-Japan), one side Middle East." Well played, punk rocker. Archive coordinator Shivaun crushed this mixtape, and I hope that you will be as thrilled as I was digging on the gems she uncovered from all over the damned place. Not surprisingly, Tian An Men 89 Records makes several contributions (I mean, come on - of course that's the first place you look, right?), but Pakistani grindcore? OK....I'm sold. Bands from: Syria, India, Uzbekistan, Iran, Turkey, Singapore, Armenia, China, Indonesia, Philippines, Algeria, Israel, Nepal, Thailand, Malaysia, South Korea, Lebanon, Morocco and United Arab Emirates.



              Cleaner Validation with ASP.NET MVC Model Binders & the Enterprise Library Validation Application Block   

    I accidentally stumbled across an awesome combination the other day:  using the Enterprise Library Validation Block with ASP.NET MVC.  Though I’ve played around with them a few times in the past, this is the first time I’ve really started to apply the Validation block in a serious application, and it just so happened to have an ASP.NET MVC website as its client.  My jaw dropped more and more as I started to realize the awesomeness that was unfolding before me…  hopefully this blog post will do the same (or as close as possible) for you!

    Using the Enterprise Library Validation Block

    It all started with an innocent enough Model requiring a wee bit of validation that I didn’t feel like hand-writing, so (as usual) I turned to the EntLib library to do it for me.  Applying the Enterprise Library Validation Block was surprisingly simple. 

    It all started with a simple enough class (the names have been changed to protect the innocent):

    public class Product
    {
        public int ID { get; set; }
        public string Name { get; set; }
        public string Description { get; set; }
        public double Price { get; set; }
        public int Quantity { get; set; }
    }

    This is basically just a DTO (data transfer object), but this ain’t the Wild West – there are rules, and they need to be followed!  After a few minutes, I’d come up with something like this:

    using Microsoft.Practices.EnterpriseLibrary.Validation;
    using Microsoft.Practices.EnterpriseLibrary.Validation.Validators;
    
    public class Product
    {
        [RangeValidator(
            1, RangeBoundaryType.Inclusive,             /* Lower Bound */
            int.MaxValue, RangeBoundaryType.Inclusive   /* Upper Bound */
        )]
        public int ID { get; set; }
    
        // Let's assume that we've got a field length limitation in
        // our database of 500 characters, which we'll check for here
        [StringLengthValidator(
            1, RangeBoundaryType.Inclusive,             /* Lower Bound */
            500, RangeBoundaryType.Inclusive            /* Upper Bound */
        )]
        public string Name { get; set; }
    
        // No rules for the description - anything goes!
        public string Description { get; set; }
    
        // The Price can be whatever we want, as long as it's positive
        [RangeValidator(0, RangeBoundaryType.Inclusive, double.MaxValue, RangeBoundaryType.Inclusive)]
        public double Price { get; set; }
    
        // Same deal with the Quantity - we can never have a negative quantity
        [RangeValidator(0, RangeBoundaryType.Inclusive, int.MaxValue, RangeBoundaryType.Inclusive)]
        public int Quantity { get; set; }
    
    
        public bool IsValid()
        {
            return Validate().IsValid;
        }
    
        public ValidationResults Validate()
        {
            return Validation.Validate<Product>(this);
        }
    }

    There are a couple of cool things I like about this setup:

    1. Declarative validation rules:  These rules are a very explicit expression of business logic - there is no “if-else-then” mumbo-jumbo.  In other words, there isn’t any code to worry about… and no code means no bugs (well, fewer bugs at least :).  Moreover, if any of these business rules change, it’s very easy to update these attributes without hunting around for that stray line of “if-else” code somewhere.  Lastly, I’ve heard talk of these mystical “business people” who are also able to read and understand simple code; and, if you run into one of these guys/gals, they’ll easily be able to verify that you have the rules set properly as well.
    2. All of the validation logic is in one place:  all that consumers of this class need to do is set its properties and ask the object whether or not it is valid.  There are no stray “if (string.IsNullOrEmpty(product.Name))” checks scattered through your code, just “if (product.IsValid())”.  I feel like this approach has a decent amount of cohesion.  Granted, it could be a bit more cohesive if we had, say, a separate “ProductValidator”, but this seems like overkill.  Regardless, it was bugging me enough that I actually created a super-class to encapsulate this logic further up the chain of inheritance, and that made me feel a bit more comfortable:
    public class SelfValidatingBase
    {
        public bool IsValid()
        {
            return Validate().IsValid;
        }
    
        public ValidationResults Validate()
        {
            return ValidationFactory.CreateValidator(this.GetType())
                .Validate(this);
        }
    }
    
    public class Product : SelfValidatingBase
    {
        // ...
    }

    As with pretty much anything, there is at least one glaring drawback to this approach:  there is no “real-time” checking.  That is, this approach allows consumers to set invalid values on these validated properties at any time – possibly overwriting valid values without any checks prior to the update.  I think that as long as your application (i.e. developers) know about this limitation, it’s not so much of an issue, at least not for the scenarios I’ve used it in, so this drawback doesn’t really bother me.
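
    To make the drawback concrete, here is a contrived little sketch using the Product class from above (no new API here, just the properties and IsValid() we already defined):

    var product = new Product { ID = 1, Name = "Widget", Price = 9.99, Quantity = 5 };
    
    // No exception here - the invalid value is silently accepted...
    product.Price = -1;
    
    // ...and the problem only surfaces when somebody remembers to ask.
    bool stillValid = product.IsValid();    // false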

    Now, let’s see how this applies to ASP.NET MVC…

    The Awesomeness that is ASP.NET MVC’s Model Binders

    When it comes to me and ASP.NET MVC’s Model Binders it was love at first site – and I haven’t stopped using them since.  In case you’re not sure what I’m talking about, here’s an example.  Instead of an Action with individual parameters and populating a new instance ourselves like this:

    [AcceptVerbs(HttpVerbs.Post)]
    public ActionResult Create(string username, string message, string userUrl)
    {
        var comment = new Comment
                          {
                              Message = message,
                              Username = username,
                              UserUrl = userUrl,
                              PostedDate = DateTime.Now
                          };
        commentsRepository.Add(comment);
        return RedirectToAction("Index");
    }

    We let the MVC framework populate a new instance for us, like this:

    [AcceptVerbs(HttpVerbs.Post)]
    public ActionResult Create(Comment comment)
    {
        commentsRepository.Add(comment);
        return RedirectToAction("Index");
    }

    I just think that’s beautiful, and so I’ve come to (over?)use Model Binders on my Controller Actions almost exclusively. 

    ASP.NET MVC Model Binders + Enterprise Library Validation Block = BFF

    The magic I referred to at the beginning of this post first exposed itself when I inadvertently used one of my Model objects (like the one I showed earlier) as an Action parameter via MVC’s Model Binding, which was really only a matter of time given how much I’d taken to using them, and then created some validation logic for it (if you’re not sure what I mean by “creating validation logic”, you’ll want to check out this article on MSDN before continuing).  I started writing my validation logic in my Action and populating the ModelState with my validation errors like so:

    [AcceptVerbs(HttpVerbs.Post)]
    public ActionResult Create(Product product)
    {
        if (!product.IsValid())
        {
            if(string.IsNullOrEmpty(product.Name))
                this.ModelState.AddModelError("name", "Please enter a product name");
            if(product.Price < 0)
                this.ModelState.AddModelError("price", "Price must be greater than 0");
            if(product.Quantity < 0)
                this.ModelState.AddModelError("quantity", "Quantity must be greater than 0");
    
            return View(product);
        }
    
    	productRepository.Add(product);
        return View("Index");
    }

    Now, even if I moved this code outside of my Action, I’d still be pretty embarrassed by it…  but after looking at it for a while I realized that I don’t have to write it at all – the EntLib ValidationResult (usually) maps perfectly to MVC’s Model Binding…  and ModelState errors!  Check out the same code, taking advantage of the EntLib validation results:

    [AcceptVerbs(HttpVerbs.Post)]
    public ActionResult Create(Product product)
    {
        var validationResult = product.Validate();
        if (!validationResult.IsValid)
        {
            foreach (var result in validationResult)
                this.ModelState.AddModelError(result.Key, result.Message);
    
            return View(product);
        }
    
    	productRepository.Add(product);
        return View("Index");
    }

     

    I added this and awesomeness ensued.  The magic comes from the fact that the Key field of the EntLib ValidationResult is the name of the property which is causing the validation error.  This is what enables line 8 above, which simply iterates through all of the validation errors and adds each message to the ModelState using its Key property, which corresponds to the form IDs that we’re using to populate the model.  Just so you don’t think I’m lying, here’s what the form would look like:

    <%= Html.ValidationSummary("Create was unsuccessful. Please correct the errors and try again.") %>
    <% using (Html.BeginForm()) {%>
        <fieldset>
            <legend>Add New Product</legend>
            <p>
                <label for="Name">Name:</label>
                <%= Html.TextBox("Name") %>
                <%= Html.ValidationMessage("Name", "*") %>
            </p>
            <p>
                <label for="Description">Description:</label>
                <%= Html.TextBox("Description") %>
                <%= Html.ValidationMessage("Description", "*") %>
            </p>
            <p>
                <label for="Price">Price:</label>
                <%= Html.TextBox("Price") %>
                <%= Html.ValidationMessage("Price", "*") %>
            </p>
            <p>
                <label for="Quantity">Quantity:</label>
                <%= Html.TextBox("Quantity") %>
                <%= Html.ValidationMessage("Quantity", "*") %>
            </p>
            <p>
                <input type="submit" value="Create" />
            </p>
        </fieldset>
    <% } %>

    I Think We Can Do Just a Bit Better…

    So, there you have it – easy validation using ASP.NET MVC Model Binders, MVC’s Validation components, and Enterprise Library’s Validation block.  The preceding should work like a charm, but, being the perpetual perfectionist and idealist, I saw one more piece of duplication that I wanted to remove.  Namely, the foreach loop used to map the ValidationResults to the ModelState.  Using an extension method on the ValidationResults class, this duplication can easily be removed like so:

    using System.Web.Mvc;
    using Microsoft.Practices.EnterpriseLibrary.Validation;
    
    public static class EntLibValidationExtensions
    {
        public static void CopyToModelState(this ValidationResults results, ModelStateDictionary modelState)
        {
            foreach (var result in results)
                modelState.AddModelError(result.Key ?? "_FORM", result.Message);
        }
    }

    Now the previous Action looks just a bit cleaner:

    [AcceptVerbs(HttpVerbs.Post)]
    public ActionResult Create(Product product)
    {
        var validationResult = product.Validate();
        if (!validationResult.IsValid)
        {
            validationResult.CopyToModelState(this.ModelState);
            return View(product);
        }
    
        productRepository.Add(product);
        return View("Index");
    }

    And with that, I’m happy…  What do you think??


              Easier Automated Database Testing with SQL Express   

    Scenario

    I've got a project in which I actually have full create scripts for my database such that I can build a whole new instance from the bottom up.  I've also got some automated unit/integration tests that I want to run against this database, complete with a bunch of scripts that can build some test data for me (unrealistic, I know...  but bear with me :).  Also, I really don't want to have to worry about configuring connection strings just for my tests - I just want some database available to me when I need it that I can wail on with requests and that gets cleaned up for me when I'm done.  Finally, I want to keep my tests as isolated as possible, which to me means a file-based SQL Express database; that way, I can attach, detach, and delete as much as I want with as little exposure and impact to the rest of my build system as possible.

    Solution

    My solution to the above scenario I found myself in was to create a helper class called TestDatabase whose job is to give me a database when I need one, provide me with a clean version of my test data before each test I run, and clean up after me when I'm done.  To this end, I started searching for how to create a file-based SQL Express database using code, and came up with Louis DeJardin's great blog post that walked me right through it.  After I had that, it was a simple matter of whipping up the class, shown below (Note: this is only a partial listing.  You can get the full listing from my code repository):

    TestDatabase.cs (partial listing)
    public class TestDatabase : IDisposable
    {
        private readonly string connectionString;
        private readonly string databaseFilename;
    
        public string ConnectionString { get { return connectionString; } }
        public string Schema { get; set; }
        public string TestDataScript { get; set; }
    
        public TestDatabase(string databaseFilename, string schema, string testData)
        {
            this.databaseFilename = databaseFilename;
            connectionString = string.Format(
                @"Server=.\SQLEXPRESS; Integrated Security=true;AttachDbFileName={0};",
                Path.GetFullPath(databaseFilename));
            Schema = schema;
            TestDataScript = testData;
        }
    
        public void Dispose()
        {
            DeleteDatabaseFiles();
        }
    
        public void RecreateTestData()
        {
            EnsureDatabaseCreated();
    
            if (!string.IsNullOrEmpty(TestDataScript))
                ExecuteQuery(TestDataScript);
        }
    
        // Create a new file-based SQLEXPRESS database
        // (Credit to Louis DeJardin - thanks! http://snurl.com/5nbrc)
        protected void CreateDatabase()
        {
            var databaseName = Path.GetFileNameWithoutExtension(databaseFilename);
    
            using (var connection = new SqlConnection(
                "Data Source=.\\sqlexpress;Initial Catalog=tempdb;" +
                "Integrated Security=true;User Instance=True;"))
            {
                connection.Open();
                using (var command = connection.CreateCommand())
                {
                    command.CommandText =
                        "CREATE DATABASE " + databaseName +
                        " ON PRIMARY (NAME=" + databaseName +
                        ", FILENAME='" + databaseFilename + "')";
                    command.ExecuteNonQuery();
    
                    command.CommandText =
                        "EXEC sp_detach_db '" + databaseName + "', 'true'";
                    command.ExecuteNonQuery();
                }
            }
    
            // After we've created the database, initialize it with any
            // schema we've been given
            if (!string.IsNullOrEmpty(Schema))
                ExecuteQuery(Schema);
        }
    }

    Let's analyze the things we've got going on here:

    1. First, we've got the CreateDatabase() method - basically ripped right from Louis's blog post linked above - which does the magic of creating a file-based SQL Express database.  It all boils down to a "CREATE DATABASE" and "EXEC sp_detach_db" call on the local SQL Express instance's tempdb database, which everyone has access to.  Then when that's all done, I execute the schema script that the tester passed in to build the database schema and finish the initial setup.
    2. Now that the database has been created and initialized with its schema, we can run some tests against it!  Problem is, at this point it's just an empty database...  Fortunately for us, we've got the RecreateTestData() method, which just executes the TestDataScript against the current database, allowing us to easily populate whatever test data we want!  This script should include everything it needs to clean out the database and rebuild it from scratch with a new set of clean data.
    3. Built-in connection string management.  As you can see, our constructor takes in a database filename, builds a connection string out of it, and then exposes that connection string to our testers via a read-only property.  That is one less connection string that our test project has to worry about managing in its app.config (or whatever), which is pretty nice and clean, IMHO!
    4. Finally, our big finale:  cleaning up after ourselves!  You can see that TestDatabase implements IDisposable, allowing us to create a Dispose() method which cleans up after everything we've done - namely, deleting the database files we've created along the way.  This means that after everything is said and done, we've left not one footprint of our presence on the build system.  (A sketch of the helper methods referenced in this list but omitted from the partial listing follows below.)
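
    The partial listing above calls a few helpers that aren't shown - EnsureDatabaseCreated(), ExecuteQuery(), DeleteDatabaseFiles(), and the Initialize() method used in the examples below.  The real implementations live in the linked repository; what follows is only a rough sketch of the shape they likely take, so the moving parts are clear:

    // Sketch only - the authoritative versions are in the full TestDatabase.cs.
    protected void EnsureDatabaseCreated()
    {
        // Lazily create (and detach) the database file the first time it's needed.
        if (!File.Exists(databaseFilename))
            CreateDatabase();
    }
    
    public void Initialize() { Initialize(false); }
    
    public void Initialize(bool recreateTestData)
    {
        EnsureDatabaseCreated();
        if (recreateTestData)
            RecreateTestData();
    }
    
    // Runs a SQL script against the attached database.  Note that a script
    // containing "GO" batch separators would need to be split first, since
    // ExecuteNonQuery sends the whole text as a single batch.
    protected void ExecuteQuery(string sql)
    {
        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open();
            using (var command = connection.CreateCommand())
            {
                command.CommandText = sql;
                command.ExecuteNonQuery();
            }
        }
    }
    
    // Deletes the .mdf and its log file so no trace is left on the build system.
    protected void DeleteDatabaseFiles()
    {
        SqlConnection.ClearAllPools(); // release any pooled file handles first
    
        if (File.Exists(databaseFilename))
            File.Delete(databaseFilename);
    
        var logFilename = databaseFilename.Replace(".mdf", "_log.ldf");
        if (File.Exists(logFilename))
            File.Delete(logFilename);
    }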

    Now, after we've got our TestDatabase class available, our unit tests become as easy as this:

    public void SomeCoolDatabaseDrivenServiceTest()
    {
        var mySchema = System.IO.File.ReadAllText("mySchema.sql");
        var testData = System.IO.File.ReadAllText("testData.sql");
        using (var db = new TestDatabase("TestDatabase.mdf", mySchema, testData))
        {
            db.Initialize();
            var service = new MyService(db.ConnectionString);
            service.DoSomething();
        }
    }

    Of course, individual tests can have even less code if you manage the test database outside of the test by using your test framework's setup and teardown methods.  For example, if I had a whole slew of tests against the same database (which is almost always the case), the test class would start out like this:

    TestDatabase database;
    
    public void ClassInitialize()
    {
        var mySchema = System.IO.File.ReadAllText("mySchema.sql");
        var testData = System.IO.File.ReadAllText("testData.sql");
        database = new TestDatabase("TestDatabase.mdf", mySchema, testData);
        database.Initialize(true);
    }
    
    public void TestInitialize()
    {
        // Rebuild the test data from scratch before EVERY test
        database.RecreateTestData();
    }
    
    public void ClassCleanup()
    {
        database.Dispose();
    }

    Now that we have all of that setup and teardown logic out of the way, we can focus on what we're actually testing, and the test I showed you before becomes a simple one-liner (just as it would be if we were passing in a connection string from a configuration file):

    public void SomeCoolDatabaseDrivenServiceTest()
    {
        // No TestDatabase setup - just use its connection string!
        var service = new MyService(database.ConnectionString);
        service.DoSomething();
    }

    What's cool about this is that not only do we not have to worry about where to get our connection string from, but our entire suite of test data is also rebuilt for us before every test is run!

    Try It Out For Yourself!

    If you like what you've seen in this post and want to try it out for yourself, you can grab the full source file (complete with in-line comments and unit tests) from my repository: TestDatabase.cs.  Just drop it in your project and start using it!  If you don't want the included unit tests, you can simply delete them without affecting the main class.

    As always, I'd love to hear your comments and feedback on all this.  If you've found this useful or - better yet - if you have a better way of doing it, please let me know!


              Data Entry Clerk - U. S. Port Services - Savannah, GA   
    Required license or certification. Maintain customer databases and process invoices with high attention to detail • Must have strong computer skills with...
    From Indeed - Tue, 27 Jun 2017 21:44:08 GMT - View all Savannah, GA jobs
              Wildlife Biologist II – Baffin - GOVERNMENT OF NUNAVUT - Pond Inlet, NU   
    Applied knowledge of statistical procedures, applications, data tabulation, computer applications coupled with the ability to establish databases and geographic... $97,734 a year
    From Indeed - Fri, 17 Mar 2017 19:01:25 GMT - View all Pond Inlet, NU jobs
              How to know who your LIVE competitor is   
    I've seen a lot of tools rely on the Alexa database to give you some sort of information like rank, pageviews and visits to a site. I still do not know how reliable Alexa's data is for identifying who your competitor is, because Alexa does not have a complete database of all the websites but […]
              Receptionist/Administrative Assistant - Hiring and Empowering - Olean, NY   
    Operate &amp; own the firm CRM database with precision and excellence. All facets of position involve helping a growing Estate Planning Law firm design a path to...
    From Indeed - Tue, 27 Jun 2017 20:26:19 GMT - View all Olean, NY jobs
              Host System Administrator - Silver State Schools Credit Union - Las Vegas, NV   
    Assume responsibility as the primary interface between SSSCU and Jack Henry and Associates, the core financial database provider, for routine problem
    From Silver State Schools Credit Union - Thu, 06 Apr 2017 07:49:59 GMT - View all Las Vegas, NV jobs
              How true is grit? Assessing its relations to high school and college students’ personality characteristics, self-regulation, engagement, and achievement.   
    Duckworth, Peterson, Matthews, and Kelly (2007) defined grit as one’s passion and perseverance toward long-term goals. They proposed that it consists of 2 components: consistency of interests and perseverance of effort. In a high school and college student sample, we used a multidimensional item response theory approach to examine (a) the factor structure of grit, and (b) grit’s relations to and overlap with conceptually and operationally similar constructs in the personality, self-regulation, and engagement literatures, including self-control, conscientiousness, cognitive self-regulation, effort regulation, behavioral engagement, and behavioral disaffection. A series of multiple regression analyses with factor scores was used to examine (c) grit’s prediction of end-of-semester course grades. Findings indicated that grit’s factor structure differed to some degree across high school and college students. Students’ grit overlapped empirically with their concurrently reported self-control, self-regulation, and engagement. Students’ perseverance of effort (but not their consistency of interests) predicted their later grades, although other self-regulation and engagement variables were stronger predictors of students’ grades than was grit. (PsycINFO Database Record (c) 2017 APA, all rights reserved)
              Math self-concept, grades, and achievement test scores: Long-term reciprocal effects across five waves and three achievement tracks.   
    This study examines reciprocal effects between self-concept and achievement by considering a long time span covering grades 5 through 9. Extending previous research on the reciprocal effects model (REM), this study tests (1) the assumption of developmental equilibrium as time-invariant cross-lagged paths from self-concept to achievement and from achievement to self-concept, (2) the generalizability of reciprocal relations when using school grades and standardized achievement test scores as achievement indicators, and (3) the invariance of findings across secondary school achievement tracks. Math self-concept, school grades in math, and math achievement test scores were measured once each school year with a representative sample of 3,425 German students. Students’ gender, IQ, and socioeconomic status (SES) were controlled in all analyses. The findings supported the assumption of developmental equilibrium for reciprocal effects between self-concept and achievement across time. The pattern of results was found to be invariant across students attending different achievement tracks and could be replicated when using school grades and achievement test scores in separate and in combined models. The findings of this study thus underscore the generalizability and robustness of the REM. (PsycINFO Database Record (c) 2017 APA, all rights reserved)
              In peer matters, teachers matter: Peer group influences on students’ engagement depend on teacher involvement.   
    This study focused on the joint effects of teachers and peer groups as predictors of change in students’ engagement during the first year of middle school, when the importance of peer relationships normatively increases and the quality of teacher–student relationships typically declines. To explore cumulative and contextualized joint effects, the study utilized 3 sources of information about an entire cohort of 366 sixth graders in a small town: Peer groups were identified using sociocognitive mapping; students reported on teacher involvement; and teachers reported on each student’s engagement. Consistent with models of cumulative effects, peer group engagement and teacher involvement each uniquely predicted changes in students’ engagement. Consistent with contextualized models suggesting differential susceptibility, peer group engagement was a more pronounced predictor of changes in engagement for students who experienced relatively low involvement from teachers. These peer effects were positive or negative depending on the engagement versus disaffection of each student’s peer group. Person-centered analyses also revealed cumulative and contextualized effects. Most engaged were students who experienced support from both social partners; steepest engagement declines were found when students affiliated with disaffected peers and experienced teachers as relatively uninvolved. High teacher involvement partially protected students from the motivational costs of affiliating with disaffected peers, and belonging to engaged peer groups partially buffered students’ engagement from the effects of low teacher involvement. These findings suggest that, although peer groups and teachers are each important individually, a complete understanding of their contributions to students’ engagement requires the examination of their joint effects. (PsycINFO Database Record (c) 2017 APA, all rights reserved)
              It’s all a matter of perspective: Viewing first-person video modeling examples promotes learning of an assembly task.   
    The present study tests whether presenting video modeling examples from the learner’s (first-person) perspective promotes learning of an assembly task, compared to presenting video examples from a third-person perspective. Across 2 experiments conducted in different labs, university students viewed a video showing how to assemble an 8-component circuit on a circuit board. Students who viewed the assembly video recorded from a first-person perspective performed significantly better than those who viewed the video from a third-person perspective on accuracy in assembling the circuit in both experiments and on time to assemble the circuit in Experiment 1, but not in Experiment 2. Concerning boundary conditions, the perspective effect was stronger for more complex tasks (Experiment 1), but was not moderated by imitating the actions during learning (Experiment 1) or explaining how to build the circuit during the test (Experiment 2). This work suggests a perspective principle for instructional video in which students learn better when video reflects a first-person perspective. An explanation based on embodied theories of learning and instruction is provided. (PsycINFO Database Record (c) 2017 APA, all rights reserved)
              Can collaborative learning improve the effectiveness of worked examples in learning mathematics?   
    Worked examples and collaborative learning have both been shown to facilitate learning. However, the testing of both strategies almost exclusively has been conducted independently of each other. The main aim of the current study was to examine interactions between these 2 strategies. Two experiments (N = 182 and N = 122) were conducted with Grade-7 Indonesian students, comparing learning to solve algebra problems, with higher and lower levels of complexity, collaboratively or individually. Results from both experiments indicated that individual learning was superior to collaborative learning when using worked examples. In contrast, in Experiment 2, when learning from problem solving using problem-solving search, collaboration was more effective than individual learning. However, again in Experiment 2, studying worked examples was overall superior to learning from solving problems, particularly for more complex problems. It can be concluded that while collaboration could be beneficial when learning under problem solving conditions, it may be counterproductive when studying worked examples. (PsycINFO Database Record (c) 2017 APA, all rights reserved)
              Developmental change in the influence of domain-general abilities and domain-specific knowledge on mathematics achievement: An eight-year longitudinal study.   
    The contributions of domain-general abilities and domain-specific knowledge to subsequent mathematics achievement were longitudinally assessed (n = 167) through 8th grade. First grade intelligence and working memory and prior grade reading achievement indexed domain-general effects, and domain-specific effects were indexed by prior grade mathematics achievement and mathematical cognition measures of prior grade number knowledge, addition skills, and fraction knowledge. Use of functional data analysis enabled grade-by-grade estimation of overall domain-general and domain-specific effects on subsequent mathematics achievement, the relative importance of individual domain-general and domain-specific variables on this achievement, and linear and nonlinear across-grade estimates of these effects. The overall importance of domain-general abilities for subsequent achievement was stable across grades, with working memory emerging as the most important domain-general ability in later grades. The importance of prior mathematical competencies on subsequent mathematics achievement increased across grades, with number knowledge and arithmetic skills critical in all grades and fraction knowledge in later grades. Overall, domain-general abilities were more important than domain-specific knowledge for mathematics learning in early grades but general abilities and domain-specific knowledge were equally important in later grades. (PsycINFO Database Record (c) 2017 APA, all rights reserved)
              Working memory strategies during rational number magnitude processing.   
    Rational number understanding is a critical building block for success in more advanced mathematics; however, how rational number magnitudes are conceptualized is not fully understood. In the current study, we used a dual-task working memory (WM) interference paradigm to investigate the dominant type of strategy (i.e., requiring verbal WM resources vs. requiring primarily visuospatial WM resources) used by adults when processing rational number magnitudes presented in both decimal and fraction notation. Analyses revealed no significant differences in involvement of verbal and visuospatial WM, regardless of notation (fractions vs. decimals), indicating that adults rely upon a mix of strategies and WM resources when processing rational number magnitudes. However, this pattern interacted with algebra ability such that those performing better on the algebra assessment relied upon both verbal and visuospatial WM when engaging in rational number comparisons, whereas rational number performance by adults with low algebra fluency was affected only by a simultaneous verbal WM task. Together, results support previous work implicating the involvement of WM resources in rational number processing and is the first study to indicate that the involvement of both verbal and visuospatial WM, as opposed to relying primarily on verbal WM, when processing rational number magnitudes may be indicative of higher mathematical proficiency in the domain of algebra. (PsycINFO Database Record (c) 2017 APA, all rights reserved)
              Phonological processing in children with specific reading disorder versus typical learners: Factor structure and measurement invariance in a transparent orthography.   
    Although children with specific reading disorder (RD) have often been compared to typically achieving children on various phonological processing tasks, to our knowledge no study so far has examined whether the structure of phonological processing applies to both groups of children alike. According to Wagner and Torgesen (1987), phonological processing consists of 3 distinct constructs: phonological awareness (PA), rapid automatized naming (RAN), and the phonological loop (PL) of working memory. The present study examined whether this phonological processing model which was originally developed for English orthography is also applicable to a more transparent language such as German. Furthermore, we tested whether the structure of phonological processing is invariant across typically achieving children and children with RD. Therefore, 209 German-speaking 3rd graders (100 typical learners and 109 children with RD) completed a comprehensive test battery assessing PA, RAN, and PL. Using confirmatory factor analyses, we compared the latent structure of these phonological processing skills across both groups. The study yielded 3 important findings: First, Wagner and Torgesen’s (1987) model transfers to the German language and its orthography with transparent grapheme-to-phoneme correspondences. Second, the tripartite structure of phonological processing was evident across both groups (factorial invariance). Third, group invariance was also found for the measurement and structural components of the model (measurement invariance). These findings suggest that the nature of phonological processing is invariant across typically achieving children and children with RD acquiring the transparent orthography of German. Theoretical and practical implications are discussed. (PsycINFO Database Record (c) 2017 APA, all rights reserved)
              Peer influence on children’s reading skills: A social network analysis of elementary school classrooms.   
    Research has found that peers influence the academic achievement of children. However, the mechanisms through which peers matter remain underexplored. The present study examined the relationship between peers’ reading skills and children’s own reading skills among 4,215 total second- and third-graders in 294 classrooms across 41 schools. One innovation of the study was the use of social network analysis to directly assess who children reported talking to or seeking help from and whether children who identified peers with stronger reading skills experienced higher reading skills. The results indicated that children on average identified peers with stronger reading skills and the positive association between peer reading skills and children’s own reading achievement was strongest for children with lower initial levels of reading skills. The study has implications for how teachers can leverage the advantages of peers via in-class activities. (PsycINFO Database Record (c) 2017 APA, all rights reserved)
              Aeronautical Data Quality Engineer   
    MD-Lexington Park, Provide Subject Matter Expertise and Systems Analysis for quality of aeronautical databases supporting CNS/ATM RNP RNAV flight. Essential Job Functions: Provide system analysis and requirements analysis on data compliance with civil and military CNS/ATM aeronautical data quality standards and requirements. Provide systems analysis and requirements analysis on compliance with military and civil equ
          Job Vacancies at PT Netwave Teknologi   
    Job Vacancies at PT Netwave Teknologi: Founded by a team with more than 10 years of experience in the telecommunication industry, PT Netwave Teknologi is your partner for business solutions, especially for the telecommunication industry, giving you reliable products, services and solutions.

    The telecom business is on one side spurred by rapid growth, fast changes in technology, and a major trend towards deregulation and globalization, causing division of the market segment and making it more specialized. On the other side, it suffers from a lack of open standards, proprietary systems, and inflexible and diverse applications.

    Thus to stay in business, operators have to provide cheaper and faster service at a higher quality level and also think about connecting to partners' networks to provide business synergy. The challenge is to efficiently manage one's own networks and seamlessly be able to integrate and control multi-vendor and multi-technology products.

    If you’re a hardworking, motivated, and team-oriented person, Netwave Teknologi would like to talk to you! We offer competitive and attractive compensation (based on expertise level). We strive to provide a safe and enjoyable working environment.

    PT Netwave Teknologi (www.netwave-tek.com) is urgently looking for candidates for Telecommunication Project who possess following requirements:


    Project Management Officer (PMO), Work Location: Depok

    Role Responsibilities:

    * To take on the end-to-end management of both project management and product development.
    * Creating the project plan and schedule; executing, monitoring and controlling project planning to keep the project time frame consistent (complete project management officer steps and tasks).
    * Work with key internal and external stakeholders to define and scope projects; determine the resources required, highlight risks, set clear objectives and deliver high-quality project goals within limited time, budget and resources, including gathering documents from internal and external parties related to the project.
    * Work interfaces with all project-related teams, including business development and marketing, development, vendors, solution development, quality assurance and client support.
    * Supervise and motivate the team members, verify status and assigned activities, participate in internal project steering committees and follow up on actions.

    Requirements:

    * Bachelor degree, preferably in Information technology or Computer Science.
    * Minimum of 2 years software project management experience. Preferably in Telco industry.
    * Having knowledge in Application Software Engineering, RDBMS and Infrastructure.
    * Having experience of handling multiple projects under tight timescales at the same time (multitasking capability).
    * Highly developed communication skills (Presentation, Interpersonal and Influencing others Skill).
    * Capable to work under pressure and long working hours.
    * Excellent analytical and decision-making skills.
    * Adaptive of very rapid changes situation.
    * Initiative, Creative, High Commitment and Struggle.
    * Creative in finding the best way out of any problem or situation (especially for non-technical issues).

    System Development (SysDev), Work Location: Depok

    Role Responsibilities:

    * Propose solutions or ideas from a technical point of view in technical design discussions.

    * Develop / construct application or modules based on technical design by using defined framework and technical environment.
    * As second level support in term of problem solving.

    Requirements:

    * Strong analytical skills.
    * Good in communication skills and high working capability in teamwork
    * Excellent knowledge in Java Fundamental, J2EE, J2ME and its related technology (Application Server, EJB, framework, etc)
    * Good knowledge in Database (Oracle, mySQL)
    * Good knowledge in System Design using UML
    * Experienced in Web Based System Development
    * Experienced in Mobile Application Development is desired.
    * Familiar with versioning system (SVN)
    * Knowledge of other programming languages (C/C++, VB, Delphi) will be a plus

    To apply for this position, you are welcome to submit a resume with contact details, a photo attached and your expected salary by email to:

    hrd.recruit@netwave-tek.com
    Please indicate the position code you are applying for in the subject of your email. Only short listed candidates will be notified.


              Database & Data Management Account Executive - SAP - Palo Alto, CA   
    Possess hands-on knowledge of SAP, Sybase, Oracle, IBM, Microsoft, Teradata, Informatica, MapR, Cloudera, Hortonworks or other associated database and data...
    From SAP - Tue, 13 Jun 2017 02:43:05 GMT - View all Palo Alto, CA jobs
              Michelle Breyer   
    In honor of this week’s launch of Curl Gloss, our new lightweight glossing gel that shapes, shines and offers ultra-hydrating control without crunch, we wanted to dive deeper into the curl community. Who better to ask than its unofficial leader, Michelle Breyer, co-founder of NaturallyCurly.com? We talked to Breyer about how her mega successful website came to be, why curly girls are so passionate and some tips for curls that she’s picked up along the way.

    How did the idea for Naturally Curly come about?


    We started NaturallyCurly.com 16 years ago, before blogs, social media, etc. We were frustrated with our hair, and knew there were others who felt the same, but there were so few curl-focused stylists and products and even fewer curly-haired models in magazines to learn from. Someone overheard us complaining about our hair at brunch and suggested we start a website. My neighbor’s 13-year-old son created the site. We started with a discussion board that people could use to share their thoughts, tips and more. Then we started posting product and stylist reviews. The stylist reviews were such a key part of it because finding a stylist who can work with curly hair is essential.

    The site evolved out of the community, which was extremely active. Approximately 60 percent of the population has some texture to their hair, and they weren’t being heard before. Our boards were the core of the site. We now have about 11 million unique visitors to our sites (which also include CurlyNikki, CurlMart and CurlStylist) and social media.

    Why was it so important for you to help build the curl community?


    The curl community can really help each other feel good about themselves. It’s so much about support. I didn’t feel like I had that growing up because my mom and sister had straight hair and couldn’t relate to me. Everyone around me in California had Farrah Fawcett hair—except me.

    Why do you think the curl community is so passionate?


    It truly is a unique community. People have self-image issues that they bond over; they have shared experiences. Curly hair is always a struggle. With curly hair, you can use the same product every day and your hair will look different every day—it has its own personality!

    There are so many nuances to curly hair. In recent years, we’ve seen a lot of women of color transitioning from relaxers to natural hair because it’s becoming so much more acceptable. Other women blew out their hair every day and are now learning to work with their natural texture. Our members like to give and get advice.


    How have you seen the curl community change over the last several years?


    It’s a completely different world than when we started the site. There are so many more product options, and pop culture accepts so many more standards of beauty. It’s now more of a multicultural society with more texture. Social media has also really helped. Before, we had to rely on magazines and movie stars telling us what was pretty; normal people weren’t sharing what they thought was pretty. Social media has really created an open dialogue and given us access to so much inspiration.
    I’ve seen an increased number of hairstylists becoming passionate about working with texture, but there’s still a long way to go! Curls are not a trend or fad—this is a 365-day-a-year thing.

    What are some of the most common discussion topics on the site?


    Some of the most popular topics include hair growth, hair breakage, transitioning to natural hair, frizz, adding moisture and keratin treatments to loosen curls and make them more defined and predictable. We also have moms who don’t have curly hair but have curly kids and women who suddenly get curls after chemotherapy. People are also really into ingredients—they’re very savvy, educated and curious.

    Your website focuses a lot on texture types. How have you defined those?


    Inspired by Andre Walker's hair types, our Texture Typing system details the varieties of wavy, curly and coily hair to give people a starting point to figure out what kind of products they should use. Along with curl pattern, porosity, density, width and length of your hair play a part as well. We help you see products other people with hair like yours have used.

    We have a “test” visitors can take. Curly Spirally (3b) is most common on the site, followed by Curly Twirly (3a). We have quite a few people with wavy hair as well—they struggle because they go back and forth between texture and straight and don’t know how to work with their wave.

    You started CurlStylist a few years ago. Why is this such a big deal?


    Most beauty schools still don’t train hairdressers how to work with curls, but it’s important for stylists to get educated! You want to feel comfortable with curls; they are a totally different animal than straight hair. There are so many unique techniques for curls, and you have to know how the hair will react with shrinkage and lack of uniformity.

    Curly haired clients will be your best friends—they are extremely loyal!


    What are some of the top curly hair tips you’ve picked up over the years?


    • Never use a towel to dry your hair; always use a t-shirt, which is gentler.
    • A good diffuser is a must.
    • Don’t brush your hair when it’s dry.
    • Don’t fight the weather—if it’s going to be a muggy day, choose a style that will work with the humidity.
    • Always have a Plan B (a bun is my best friend!).
    • Find a stylist who knows how to work with curly hair and feels comfortable with it. A lot of salons have curl experts within them, so always ask. We have a salon review database for this reason.

    Your company has done a lot of research on the curly hair community. What are some of the key insights you’ve learned?


    • 85% of people with curly hair say they are more likely to embrace their natural texture than they were five years ago
    • 50% of coily consumers have 11 or more hair care products in their house, compared with 20% of straight-haired consumers
    • 64% of coily consumers say they are much less likely to use a relaxer today vs. a year ago
    • When purchasing a styling product, definition is the most important attribute for curly consumers, followed by moisture
    • 65% of curly consumers cocktail hair care products
    • Curly girls are willing to travel three or more hours to go to a stylist who specializes in curls



              Senior Teradata Database Administrator   
    UT-Midvale, Senior Teradata Database Administrator Senior Teradata Database Administrator (Overstock.com, Inc., Midvale, UT) Multiple openings available. Assist with providing guidance on projects incl physical database implementation, application/SQL performance tuning, security administration, & infrastructure process development & enhancement. Provide support for Enterprise Data Warehouse. Min Reqs: Bachel
              (USA-OR-Portland) Staffing Specialist - CCS Home Health - On-Call   
    Coordinates, facilitates and manages 24-hour staff visit scheduling process of the Continuing Care Services Department for Home Health and Hospice, Home Infusion Nursing, Palliative Care, and Physical Medicine to meet identified operational objectives in a manner consistent with department policies, union contracts, operational efficiency and mission. Makes possible the delivery of most effective, timely, quality care to the homebound patient. Essential Functions: - Manages staffing/fulfillment/patient needs of the CCS Department by coordinating resources with the needs of the CCS Department by scheduling staff for home visits in a manner that makes possible maximum staff productivity and cost control. - Matches patient needs and patient location with staff availability, skill, workload, union seniority and geographical assignments. Initiates supervisor contact to augment or reduce staff as necessary. - Acts as a resource to supervisors, managers, directors, and support staff to meet informational needs. - Analyzes, assesses, projects, and produces accurate and timely staff schedules to ensure appropriate staff/patient coverage. - Follows appropriate regional and departmental policies and procedures, Local 49 and OFN union contracts. - Effectively maintains daily accurate computerized records of admissions, discharges, visits needed, visits scheduled, staff availability and skills, visit allocation system points, and statistical data. - Prepares daily statistical reports. - Facilitates staff/patient visit scheduling that minimizes professional staff overtime and mileage, outside referrals and unnecessary visits while maximizing staff productivity and operational efficiency. - Facilitates communication between physicians, patients/families, department supervisors and managers, Home Health/Hospice, Home Infusion, Palliative Care, and Physical Medicine staff using pagers, written messages, computers and voice mail. - Effectively communicates with clinical supervisors to coordinate staff/patient visit needs. - Makes decisions without prior approvals, which include creating solutions, problem solving/negotiating, conflict resolution, and setting personal work priorities. - Informs appropriate staff of patient discharge or death in a timely manner. - Participates in departmental quality management activities including performance measurement of scheduling processes. Qualifications: Basic Qualifications: - Two (2) years of experience in a staff scheduling position or equivalent knowledge/experience. - Equivalent experience in a closely related field (customer service, computerized scheduling, scheduling software, patient-based computer application, dispatching, accounting). - High school diploma or GED. - Working knowledge of personal computers, database software and adapt them to meet departmental needs. - Conflict resolution/negotiation skills. - Basic mathematical skills. - Geography of northwest Oregon and southwest Washington. - Basic knowledge of a Health care delivery system. - Demonstrates customer-focused service skills. - Able to analyze complex problems. - Able to manage multiple tasks and conflicting demands for service. - Organizational, pattern recognition and process analysis skills. - Skills in problem solving/critical thinking and prioritization to meet a fast-paced, ever-changing environment. - Able to make and communicate decisions and take action in high-pressure situations with tact, diplomacy and respect. 
- Demonstrated effective communication skills and strong attention to detail. - Ability to accurately process a large quantity of data despite distractions and interruptions. - Demonstrated excellent attendance record. Preferred Qualifications: - Two (2) years of recent experience in scheduling or closely related field preferred. - Previous experience in design, coordination and maintenance of staff schedules preferred. - Two (2) to three (3) years of experience working with Kaiser Permanente operations preferred. - Home Health/Hospice scheduling experience preferred. - Associate's degree in communications, bookkeeping or accounting or related knowledge and experience preferred. - Thorough knowledge of home health/hospice and Kaiser Permanente operations preferred. - Working knowledge of HMO philosophy and mission, computerized scheduling systems/software, voice mail, word processing, database software, productivity measurements and continuous quality improvement preferred. - Working knowledge of KARE system preferred. - Thorough knowledge of union (Local 49, OFN) labor contracts, and home health/hospice operations preferred. - Medical Terminology preferred. - 10-key calculator skills preferred. **COMPANY** *Kaiser Permanente***TITLE** *Staffing Specialist - CCS Home Health - On-Call***LOCATION** *Portland, OR***REQNUMBER** *610374* External hires must pass a background check/drug screen. Qualified applicants with arrest and/or conviction records will be considered for employment in a manner consistent with Federal, state and local laws, including but not limited to the San Francisco Fair Chance Ordinance. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, protected veteran, or disability status.
              Comment on CAMERAPEDIA!: The Equipment Database by Ming Thein   
    HE - haha! Summilux issues: asymmetry, outright soft, aperture blades detached, tight spots in focus throw...I hear QC is much better now, but still check well (especially at that price...)
              Comment on Did Obama “Choke” When Kremlin Attacked Our Election? by Thoughtopsy   
    It's true that they don't have a leader right now. But at least they have principles they believe in and stick to: Single payer. Better opportunities for all. Free education. Higher taxes. Less poverty. Less racism. Less sexism. More equality. Equal rights and application of the law... etc. Whereas President Angry 5th Grade P*ssy-Grabber and a tax grab bill with a 17% approval rating make a mockery of any argument that the right still has any. Today's Example: The right has always been against a national gun registry. Because it's bigger government. Reduces their "Freedom". Puts their personal private information in the hands of the Feds.... yada yada yada.... Until... THEY decide they want a national voter database. Which increases the size of government, and reduces your freedom by giving all your private information to the Feds.... And there it is... No principles they won't break. No beliefs they won't p*ss on. Hypocritical to the core.
              Comment on States Refuse To Hand Over Private Voter Data To Trump Panel by Thoughtopsy   
    Right Wingers then: "NO, YOU CAN'T ADD ME TO A NATIONAL GUN LICENSE DATABASE... THAT INFRINGES MY FREEDOM WITH BIG GOVERNMENT DATA COLLECTION OF MY PRIVATE INFORMATION. THAT'S WRONG!!!" Right Wingers now: "Of course we're creating a national database of all voters and their private information including addresses, political affiliations and social security numbers.... what's wrong with that???" W. T. F???
              Is It Time To Specialize?   

    Originally posted on: http://ferventcoder.com/archive/2013/06/18/is-it-time-to-specialize.aspx


    Over my career I have made a living as a generalist.  I have been a jack of all trades and a master of none.  It has served me well in that I am able to move from one technology to the other quickly and make myself productive.  Where it becomes a problem is deep knowledge.  I am constantly digging for the things that aren’t basic knowledge.  How do you make a product like WCF or Windows RT do more than just “Hello World”?

    As an architect I need to be a jack of all trades.  This is what helps me to bring the big picture of a project into focus for developers with different skills to accomplish the goals of the project. It is key when the mix of technologies crosses Windows, Unix and mainframe systems with different languages and databases.  The larger the company the project is for, the more likely this scenario will arise.

    As a consultant and a developer I need to have specialized skills in order to get the job done efficiently.  If I have a SharePoint or Windows Phone project, knowing the object model details and possible roadblocks of the technology allows me to stay within budget as well as better advise the client on technology decisions.

    What is the solution?  Constant learning and associating with developers who specialize in a variety of technologies is the best thing you can do.  You may have thought you were done with classes when you left college, but in this industry you need to constantly be learning new products and languages.  The ultimate answer is that you must generally specialize.  Learn as many subject areas as possible, but go deep whenever you can.  Sleep is overrated.  Good luck.


              Style Seekers   
    When fashion editor and stylist Elisa Goodkind went to re-enter the fashion industry after a long hiatus to raise her children, she found that a major shift had occurred: the creativity and passion of the ‘80s and ‘90s had been largely replaced by consumer-focused styles that lacked originality. Aiming to showcase the people who were still taking risks and using fashion to express their individuality, Elisa teamed up with her daughter, Lily Mandelbaum, to create Stylelikeu, a video-based website that takes readers into the homes and closets of people whose style they find interesting and inspiring. The subjects include designers, musicians, performance artists, teachers and people from all walks of life around the world.

    Oribe Hair Care is partnering with Stylelikeu to bring you these profiles—and go a little more in-depth with each subject’s beauty routine. Check back each week for a new video and interview.

    To kick off this series, we talked to Elisa and Lily to find out a little bit more about the site and their own personal styles.


    Who are some of the most memorable people you’ve profiled?


    Lily & Elisa: Fatima Robinson. She is a hugely successful hip-hop choreographer who lives in LA. Her house was one of those where we entered and wondered how on earth we were going to photograph everything within the next three hours! From the art to the clothes to the books, it was just amazing. For someone who is so renowned and accomplished, she was so humble and so open. There was no pretense; she just let it all out there. She even danced for us. It was very clear that her ability to be open about her experiences and be open to new experiences is what has made her so successful. She was also having so much fun with her clothes and was so excited to share her obsession with adorning herself in beautiful things—it seemed almost equal to her love for dance. She was oozing passion everywhere.

    Elisa: Ilona Royce Smithkin: Ilona is 95 years old, and I was so struck by her vivaciousness. When we showed up at her one-bedroom apartment—the same one she’s lived in for 50-some years—it was packed floor-to-ceiling with her life…paintings, books, clothes, everything. In the midst of shooting her, she began to draw one of my eyes, all the while telling me exactly who I was as a person and describing all my characteristics perfectly. The drawing was 100 percent accurate as well. She's an amazing spirit, and I can only hope to be so strong and open at her age.

    Lily: Oliver & Kira: On our first Stylelikeu shooting trip to London, we had no free time scheduled at all. At the last minute, somebody cancelled their shoot and we decided to take advantage of the couple of hours we had free and go to this flea market someone had recommended. We were walking through and, all of a sudden, I saw this girl and said, "Holy shit, we need to scout her!" She was totally in her own world—she didn’t even know her own phone number and had to call her boyfriend, Oliver, over to give it to us. We were so obsessed with them that we made room to shoot them the next day. When we got there, Kira wasn't there (apparently she didn't know her schedule either), so we went back that night at 10pm to shoot them together. It was completely dreamy.

    How do you find new subjects?


    Lily: We are constantly finding new subjects everywhere. We are always scouting at events like concerts, fashion shows and art openings to find people. We also ask each person that we shoot to recommend friends that they think would be a good fit for the site. At this point, we are totally overwhelmed with possible subjects, as we have more than 5,000 people in our database. The network is growing organically at this point.

    What have you learned about style and personal expression in general through your interviews?


    Lily: That style is 100 percent personal and cannot—and should not—be dictated by what the media and marketing campaigns tell us to wear. Style can be accessible to anyone and everyone; looking stylish is more about your confidence and sense of self within than it is about the pieces of clothing that you are wearing.

    What are your personal styles?


    Lily: Because I am curvy, I wear a lot of dresses and skirts. One might call my style bohemian. I usually feel like my clothing is much better suited to a California lifestyle than to the streets of New York. I can't stand structured clothing; it doesn't suit my personality to feel boxed in. My mom also has a bit of a hippy sensibility, but integrates more avant-garde/rock influences. She loves designers like Rick Owens and Martin Margiela. Because she has a more boyish body than I do, she incorporates a lot of menswear into her wardrobe.


    What fashion items can’t you live without?


    Lily: I can't live without long, printed dresses and moccasins.
    Elisa: I can't live without my Rick Owens leather jacket and piles of rings.

    How has your personal style been influenced by the interviews you’ve done and the people you’ve met through Stylelikeu?


    Lily: My mom and I both experienced certain seminal moments at the beginning of our journey interviewing people for Stylelikeu in which we embraced our own individuality and became more comfortable in our style and skin. During our first interview ever, my mom was inspired by our muse, Erica Yarbrough, to accept her flat-chestedness and throw away all of her padded bras. I was similarly inspired by some of the curvier girls on the site to stop trying to squeeze into the skinny jeans that the magazines had dictated were “in,” and rather, to begin wearing more skirts and dresses that were more suited to my body type. Being comfortable and not judging one’s self in comparison to some singular “ideal” is what is most exciting about style for me.


    How do you hope to influence/affect others with Stylelikeu?


    Lily: I hope people can access their own freedom of expression through clothing and not feel intimidated by or excluded from fashion. Fashion should not be a source of judgment, but should be a creative and joyful part of one's everyday life.
              ADO.NET Entity Framework – Code First Development Pattern   

    Originally posted on: http://geekswithblogs.net/archive/2011/04/14/ado.net-entity-framework-ndash-code-first-development-pattern.aspx

    Here’s a quick start introduction to the ADO.NET Entity Framework Code First Development Pattern.  For a more complete (but still intro) walkthrough, check out ScottGu’s blog.

    At the time of this writing, you’ll need the EF 4.1 Release Candidate.  After EF 4.1 is released you’ll no doubt be able to find it on the ADO.NET team blog.

    1.  Create a new empty ASP.NET MVC 2 Web Application and add a reference to System.Data.Entity.  If using the EF 4.1 RC, also add a reference to the EntityFramework dll that comes installed with it.

    2.  Add the model classes you’d like to use to represent your data objects.  For example, if you need to represent game nights, you might create a class called Game with properties that describe the game night date, location, and other information, as sketched below.
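
    For instance, a bare-bones Game class might look like the one below.  The property names are my own illustration of "date, location, and other information" - nothing about them is prescribed:

    public class Game
    {
        // By convention, EF treats a property named Id (or GameId) as the primary key.
        public int Id { get; set; }
    
        public DateTime GameDate { get; set; }   // when game night happens
        public string Location { get; set; }     // where it happens
        public string Notes { get; set; }        // any other information
    }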

    3.  Create a “Context Class” that inherits from DbContext and contains nothing but public DbSet<yourclassnamefromstep2> properties.  For example, if in step 2 you created a class called “Game”, here you’d have a property defined as:

    public DbSet<Game> Games { get; set; }
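
    Put together, the entire context class for the Game example is only a few lines (the name GameContext is my own choice; any name will do):

    // Requires a using for System.Data.Entity (where DbContext lives in EF 4.1).
    public class GameContext : DbContext
    {
        public DbSet<Game> Games { get; set; }
    }

    With that in place, basic usage is as simple as:

    using (var db = new GameContext())
    {
        db.Games.Add(new Game { GameDate = DateTime.Now, Location = "My place" });
        db.SaveChanges();
    }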

    At this point, EF code first will make certain assumptions (which we can override later if necessary) that allow us to write less code: